Quality Assurance and Project Management


September 17, 2019  3:46 PM

Customer Statements About Next-Gen Zoho One @Zoho

Jaideep Khanduja
Business transformation

This is the last post in the series covering what customers have to say about Next-Gen Zoho One, as well as its pricing. The first post in the series is Next-Gen Zoho One Can Transform Enterprise Journey @zoho. The second post is Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho, and the penultimate one is Key Features of The Bold New Operating-System Services @Zoho. Now, let us first see what a few of the prestigious customers of Next-Gen Zoho One have to say about it and how they transform their businesses with it.

Customer Speaks:

Pranesh Padmanabhan, Founder & CEO, Studio 31 says,

“Zoho One as a concept has fulfilled the dream of Studio 31 in becoming a fully-fledged, technology-enabled wedding photography and film company. It has given us the confidence and strength that we can sustain in this highly competitive, unorganized industry by not worrying about manual administration work anymore and fearing about human errors; it’s a very sensitive industry and every error will cost a client long memory. Now that we have a SaaS product running our business, we’ve got brilliant ideas and we’ve got the platform and time to make high-level strategic decisions to grow bigger, better and truly be one of the most-organized businesses in this sector.”

Sonia Bhadoriya, Head of Business Development, Eurokids says,

“We switched from Salesforce to Zoho One because of the fluidity of data across apps that allowed us to connect our departments. We use Zoho CRM, Sign and Creator extensively. The analytics tool enables us to visualize big data with graphs and charts. We love it!”


Niki Kushe, Group Head – CRM, India Infoline Finance Ltd, says,

“IIFL is a leading financial services conglomerate serving over 4 million satisfied customers around the globe. Though an established organization, we are constantly on the lookout for ways to build our strength and to deliver excellent service to our ever-expanding customer base. At IIFL, we have started using Zoho One, which includes their super-powered CRM, Email, Campaign Management, Survey, Social Media Management, Sales IQ, Creator, Internal Chat and HR products across various entities. Honestly, there is no better value than Zoho One can offer, especially at this low a cost! Zoho One does change the way businesses operate by offering a whole suite of apps that are not only tightly integrated with each other but also play well with third-party applications. These vast varieties of solutions are easy to configure and customize, which in itself paves the way for an efficient cross-selling platform.”

Pricing:

The story would remain unfinished if I didn't mention the pricing for Zoho One. The pricing, in fact, is very simple and pocket-friendly for businesses of any size in any industry segment, and there are no hidden costs. It is Rs. 1,500 per employee or Rs. 3,000 per user. The ROI in the case of Zoho One is manifold and quick, as all the new features mentioned above come free of cost with Zoho One.
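To put the two price points in perspective, here is a quick illustrative comparison. The headcounts are hypothetical, and it assumes the per-employee plan licenses the entire workforce while the per-user plan licenses only selected users; the post does not state the billing period:

```python
# Illustrative cost comparison using the figures quoted above.
# (The billing period isn't stated, so treat these as per-period rates.)
ALL_EMPLOYEE_RATE = 1500   # Rs. per employee (entire workforce licensed)
FLEXIBLE_USER_RATE = 3000  # Rs. per user (only selected users licensed)

def plan_costs(employees: int, licensed_users: int) -> dict:
    """Cost of each plan for a hypothetical company."""
    return {
        "all_employee_plan": employees * ALL_EMPLOYEE_RATE,
        "flexible_user_plan": licensed_users * FLEXIBLE_USER_RATE,
    }

# A hypothetical 100-person company where only 40 people need Zoho One:
print(plan_costs(employees=100, licensed_users=40))
# {'all_employee_plan': 150000, 'flexible_user_plan': 120000}
```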

September 17, 2019  3:34 PM

Key Features of The Bold New Operating-System Services @Zoho

Jaideep Khanduja
Business Process, provisioning, Single sign-on, Telephony

This is the third post in the series on Zoho One Next-Gen features. You can access the first post here – Next-Gen Zoho One Can Transform Enterprise Journey @zoho – and the second post here – Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho. In this post, we shall discuss some of the key features offered by the bold new operating system for businesses, Zoho One.

Now let us look at some of the key features offered by the bold new operating-system services:

1. Communication: PhoneBridge – PhoneBridge is Zoho's newest entry in the bouquet of applications in Zoho One. It is, in fact, Zoho's telephony platform that enables telephony in Zoho apps. For instance, in Zoho CRM the PhoneBridge integration permits users to make calls from Zoho apps. Not only that, it also provides contextual information on incoming calls. What this means is that if PhoneBridge is enabled in Zoho CRM, it gives users context for all incoming calls, not only in Zoho CRM but also in Zoho Recruit, Zoho Mail, and 20+ other apps.

2. Single Sign-On – Single Sign-On (SSO), another new service in Zoho One, enables customers to integrate third-party applications with their account. As a matter of fact, Zoho Single Sign-On currently supports more than 50 third-party applications, and the count is increasing on a regular basis. This third-party application integration can be done in two ways: either per individual user or per group.

3. App Management and Provisioning – As of now, Zoho One allows provisioning for all of its 45-plus apps.


4. Business Workflow Management – Orchestly is a new, innovative and intuitive drag-and-drop interface that helps managers and process owners define processes effortlessly, without any technical knowledge of coding. There are ample practical examples of this, like purchase approvals, content publishing, asset management, onboarding, and so on.

5. Zoho Sign – Zoho Sign builds an additional level of validation for customers with the help of blockchain-based timestamping through Ethereum. Ethereum, as we all know, is a globally accepted open-source platform. So when a document is signed using Zoho Sign, an Ethereum transaction actually happens in the background, and the hash of the signed document is added to the notes of that transaction.
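To make the timestamping mechanism concrete, here is a minimal sketch of the general technique (not Zoho's implementation; the node URL and account are hypothetical, and attribute names follow web3.py v6):

```python
import hashlib
from web3 import Web3  # assumes the web3.py library is installed

w3 = Web3(Web3.HTTPProvider("https://mainnet.example-node.io"))  # hypothetical node

def timestamp_document(path: str, account: str, private_key: str) -> str:
    """Anchor a document's SHA-256 hash in the data field of an Ethereum tx."""
    with open(path, "rb") as f:
        doc_hash = hashlib.sha256(f.read()).hexdigest()

    tx = {
        "to": account,                       # zero-value transfer to ourselves
        "value": 0,
        "data": Web3.to_hex(text=doc_hash),  # the hash rides along as tx data
        "gas": 60_000,
        "gasPrice": w3.eth.gas_price,
        "nonce": w3.eth.get_transaction_count(account),
        "chainId": 1,
    }
    signed = w3.eth.account.sign_transaction(tx, private_key)
    return w3.eth.send_raw_transaction(signed.rawTransaction).hex()  # tx id
```

Once mined, anyone can recompute the document's hash and compare it with the on-chain data, proving the document existed in that exact form at that block's timestamp.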

The next post is the concluding post in this series.


    September 17, 2019  3:16 PM

    Business Transformation Is The Mantra of Next-Gen Zoho One @Zoho

Jaideep Khanduja
    Business transformation

This is the second post in the series, where we are talking about Next-Gen Zoho One's new features and how it empowers its customers in a multifaceted manner, focusing on multi-directional business growth. The first post can be accessed here. We are talking about the power of the Orchestly interface. This interface is so easy to manage that even the non-tech business process owners of various departments can build their own processes without any help from the technology department. Within 2 years of its launch, Zoho One has achieved substantial growth and is now catering to more than 20,000 customers across the globe. The new next-gen features added to the operating system include a new application that connects every corner of business operations, a number of updates to the existing modules, and an overall unmatched performance of the complete business suite.

If we look at the statistics of Zoho One usage, around 25% of its customers use more than 25 applications, and more than 50% use 16 or more applications from the bouquet of applications on the Zoho One platform. This itself proves how badly businesses crave an all-in-one business solution to reduce the complexity of managing a large number of vendors and the conflicts that constantly arise from integrating multi-vendor, multi-platform business applications. The speed at which Zoho One is being adopted across different business verticals is proof of a major shift in customer expectations: a rejection of complexity in favor of easy-to-use all-in-one platforms that deliver immense value.

    Business Transformation with Next-Gen Zoho One

    Rajendran Dandapani, Director of Technology, Zoho says,

“Technology is supposed to help businesses. Instead, it has evolved into a complex beast customers have to tame—from juggling apps from multiple vendors to trying to solve the multi-app integration puzzle to dealing with vendors forcing customers into expensive, lengthy contracts. The technology industry has gone too far down this path and this has to change. With Zoho One, we want to change all of that. It’s a technology platform to run your entire business with a trustworthy vendor that is easy to do business with. With Zoho One, you are not just licensing the technology. You are licensing peace of mind.”

    Don’t miss the next post in this series.


    September 17, 2019  2:29 PM

    Next-Gen Zoho One Can Transform Enterprise Journey @zoho

Jaideep Khanduja
    Blockchain, Business applications, Business Process Automation, Single sign-on, Telephony

Zoho is truly living up to what it calls itself: 'the operating system for businesses'. Zoho One is, in fact, a complete business solution with a high level of maturity and a sharp focus on all levels of the business together. With the launch of Next-Gen Zoho One, the whole ecosystem has become much stronger, with the power to deliver more to the business across all user and management levels. The operating system for businesses has been empowered with a process automation app, telephony, single sign-on, and blockchain capabilities. These next-gen features and tools enable it to achieve greater heights in customer adoption and business acceptability.

While the other business applications in the market focus on licensing money and make their systems so complex that customers are forced to rely on them, or rather stay at their helm, despite hefty annual maintenance and licensing costs, every new level of Zoho One is getting more powerful, more flexible, and more business-friendly, delivering powerful new features at no additional cost to its customers. Every new feature added becomes an integral part of Zoho One, thus improving its quality and making it more powerful and stable from the business point of view. This is proven by the fact that the popularity of Zoho One is increasing exponentially, and so is its customer base, in India as well as globally.


The next generation of Zoho One is designed and built to take care of entire business operations, including sales and marketing, finance and HR, operations and business intelligence, and so on. All this runs on a unified technology platform. The new business workflow management application, which they have named Orchestly, empowers Zoho customers to effortlessly create, manage, and optimize their business processes with the help of an intuitive drag-and-drop interface.

We shall continue in the next post with more features of Next-Gen Zoho One.


    September 16, 2019  9:26 AM

    INTELLIGENT SEARCH CAPABILITIES @cloudtenna and @nasuni

Jaideep Khanduja
    Cloud search, Cloud storage, Nasuni, Object storage

Technology has matured to a level of collaboration and innovation, and high-end tech companies have understood that very well. A recent example in this regard is the partnership between Cloudtenna and Nasuni. As a result of this partnership, enterprises will be able to get intelligent search capabilities. In fact, this is a boon for existing Nasuni clients. Now they can instantly find their files across the multiple cloud environments they use. And all this happens with a guarantee of compliance with enterprise file access rules. Cloudtenna, as we are well aware, is one of the front runners in enterprise search technology. It announced a strategic partnership with Nasuni today. This partnership results in an unmatched integration of Nasuni Cloud File Services with Cloudtenna’s DirectSearch. This, in turn, empowers enterprises to augment their enterprise-wide collaboration efforts with intelligent search capabilities.


DirectSearch is an outstanding recommendation engine, empowered with machine learning, that finds results at lightning speed: its sub-second search queries work on massive distributed datasets including millions of items. For enterprises in today's data-intensive environment, file search has become a singular priority for increasing employee productivity, especially where the workforce is distributed. Enterprises have understood well that to increase the productivity of their employees, the level of collaboration between employees based in various offices around the globe is very important. As I mentioned in one of my previous posts, recent research from IDC says an organization with 1,000 knowledge workers on board wastes around $50,000 per week. That comes to roughly $2.6 million per annum ($50,000 × 52 weeks).


All this happens due to employees' failure to locate and retrieve electronic files in time. In a few cases, it can lead to even higher losses in terms of financial transactions as well as business reputation.

    Will Hornkohl, vice president of alliances at Nasuni says,

    “For enterprise users, it can be a challenge to find the exact file they are looking for across multiple clouds, not to mention on-premises servers. Bringing Cloudtenna into our growing partner community will ensure that users at the organizations we serve can always find exactly what they are looking for quickly and easily within the entirety of their global file share – all while using a single login for all file sources and while conducting intelligent searches that reflect personalized contextual insights that are modeled on each individual user’s file activity, history, teams and relationships. And of course, Nasuni and Cloudtenna both are built with safeguards that ensure complete compliance with all file access rules and protocols.”


Around 500 enterprises rely on Nasuni. With this significant collaboration, Nasuni has empowered enterprises to reap the highest level of benefits from cloud object storage, which includes unlimited capacity and inherent resiliency. As a matter of fact, this also changes the definition of the 'economy of the cloud' for them, while retaining the control and performance of network-attached storage (NAS).


    Aaron Ganek, CEO at Cloudtenna says,

    “Nasuni empowers organizations of all kinds not only to use cloud object storage in all of its flavors for primary storage of their files but also makes an unprecedented degree of global collaboration possible. Search capabilities are even more important when you’re looking at organizations like those that rely on Nasuni, which in many cases not only have massive datasets and equally goliath global file shares but also have employees who need to access a specific file they worked on with a colleague who’s literally on the other side of the world.”

    Ganek adds,

    “The Cloudtenna DirectSearch platform is uniquely designed to tackle distributed datasets, making it the ideal solution for Nasuni’s hybrid cloud file services platform. File search infrastructure faces a unique set of requirements that goes beyond the footprint of traditional search infrastructure used for log-search and site-search. It has to be smart enough to reflect accurate file permissions. It has to be smart enough to derive context to boost search results and has to do all this in a fraction of second.”
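Ganek's description of permission-aware, context-boosted file search can be illustrated with a toy sketch (hypothetical data model and scoring, not Cloudtenna's implementation):

```python
from dataclasses import dataclass, field

@dataclass
class FileDoc:
    name: str
    allowed_users: set                               # ACL: who may see this file
    collaborators: set = field(default_factory=set)  # context signal

def search(query: str, user: str, corpus: list) -> list:
    """Match by name, enforce ACLs, then boost files the user's team touched."""
    visible = [d for d in corpus if user in d.allowed_users]  # permissions first
    matches = [d for d in visible if query.lower() in d.name.lower()]
    # Context boost: files the searching user collaborated on rank higher.
    return sorted(matches, key=lambda d: user in d.collaborators, reverse=True)

corpus = [
    FileDoc("q3-budget.xlsx", {"ana", "raj"}, {"raj"}),
    FileDoc("q3-budget-draft.xlsx", {"ana"}),
    FileDoc("q3-offsite-photos.zip", {"raj"}),
]
print([d.name for d in search("budget", "raj", corpus)])
# ['q3-budget.xlsx']  -- the draft is filtered out by permissions
```

A production engine does this at much larger scale, but the ordering of concerns is the same: filter by permissions first, then rank by relevance and context.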

To conclude, Nasuni is now a Cloudtenna tier-one supported data source. In fact, Cloudtenna is also certified as a third-party integration that is available to customers.


    August 30, 2019  2:30 PM

    Predictive Analytics and Data Mining @Amazon

Jaideep Khanduja
    Amazon, Data Mining, Predictive Analytics

    Predictive Analytics and Data Mining: Concepts and Practice with RapidMiner by Vijay Kotu and Bala Deshpande

    Book Excerpt as on Amazon.com:

Put Predictive Analytics into Action. Learn the basics of predictive analysis and data mining through an easy-to-understand conceptual framework, and immediately practice the concepts learned using the open-source RapidMiner tool. Whether you are brand new to data mining or working on your tenth project, this book will show you how to analyze data and uncover hidden patterns and relationships to aid important decisions and predictions. Data mining has become an essential tool for any enterprise that collects, stores and processes data as part of its operations. This book is ideal for business users, data analysts, business analysts, business intelligence and data warehousing professionals, and anyone who wants to learn data mining. You'll be able to:

1. Gain the necessary knowledge of different data mining techniques, so that you can select the right technique for a given data problem and create a general-purpose analytics process.
2. Get up and running fast with more than two dozen commonly used, powerful algorithms for predictive analytics, using practical use cases.
3. Implement a simple, step-by-step process for predicting an outcome or discovering hidden relationships from the data using RapidMiner, an open-source GUI-based data mining tool.

    Predictive analytics and Data Mining techniques covered: Exploratory Data Analysis, Visualization, Decision trees, Rule induction, k-Nearest Neighbors, Naïve Bayesian, Artificial Neural Networks, Support Vector machines, Ensemble models, Bagging, Boosting, Random Forests, Linear regression, Logistic regression, Association analysis using Apriori and FP Growth, K-Means clustering, Density-based clustering, Self Organizing Maps, Text Mining, Time series forecasting, Anomaly detection and Feature selection. Implementation files can be downloaded from the book companion site at www.LearnPredictiveAnalytics.com

    Demystifies data mining concepts with easy to understand language
    Shows how to get up and running fast with 20 commonly used powerful techniques for predictive analysis
    Explains the process of using open source RapidMiner tools
    Discusses a simple 5 step process for implementing algorithms that can be used for performing predictive analytics
    Includes practical use cases and examples

    “If learning-by-doing is your mantra — as well it should be for predictive analytics — this book will jumpstart your practice. Covering a broad, foundational collection of techniques, authors Kotu and Deshpande deliver crystal-clear explanations of the analytical methods that empower organizations to learn from data. After each concept, screenshots make the ‘how to’ immediately concrete, revealing the steps needed to set things up and go; you’re guided through real hands-on execution.”

    –Eric Siegel, Ph.D., founder of Predictive Analytics World and author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die


    August 30, 2019  2:19 PM

    A Guide to Delivering Business Results with Big Data Fast @Amazon

Jaideep Khanduja
    Big Data

    Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast!
by Keith B. Carter and Donald Farmer

    Excerpt as on Amazon.com

    Building an analysis ecosystem for a smarter approach to intelligence
Keith Carter’s Actionable Intelligence: A Guide to Delivering Business Results with Big Data Fast! is the comprehensive guide to achieving the dream that business intelligence practitioners have been chasing since the concept itself came into being. Written by an IT visionary with extensive global supply chain experience and insight, this book describes what happens when team members have accurate, reliable, usable, and timely information at their fingertips. With a focus on leveraging big data, the book provides expert guidance on developing an analytical ecosystem to effectively manage and use internal and external information to deliver business results.

    This book is written by an author who’s been in the trenches for people who are in the trenches. It’s for practitioners in the real world, who know delivering results is easier said than done – fraught with failure, and difficult politics. A landscape where reason and passion are needed to make a real difference.

    This book lays out the appropriate way to establish a culture of fact-based decision making, innovation, forward-looking measurements, and appropriate high-speed governance. Readers will enable their organization to:

    Answer strategic questions faster
    Reduce data acquisition time and increase analysis time to improve outcomes
    Shift the focus to positive results rather than past failures
    Expand opportunities by more effectively and thoughtfully leveraging information
Big data makes big promises, but it cannot deliver without the right recipe of people, processes and technology in place. It's about choosing the right people, giving them the right tools, and taking a thoughtful, rather than formulaic, approach. Actionable Intelligence provides expert guidance toward envisioning, budgeting, implementing, and delivering real benefits.

    “Actionable Intelligence has the critical insights business leaders need to leverage Big Data to win in the emerging digital marketplace! Well done, Keith!”

    —Ed Hunter, Vice President, Product Supply-Asia, Procter & Gamble Europe SA – Singapore


    August 30, 2019  2:15 PM

    Executing Data Quality Projects : Book on @Amazon

Jaideep Khanduja
    Data quality

    Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information (TM) 1st Edition by Danette McGilvray

    Excerpt as on Amazon.com

    Executing Data Quality Projects presents a systematic, proven approach to improving and creating data and information quality within the enterprise.

    Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions.


    This book describes a Ten-Step approach that combines a conceptual framework for understanding information quality with the tools, techniques, and instructions for improving and creating information quality. It includes numerous templates, detailed examples, and practical advice for executing every step of the approach. It allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.

    The author’s trademarked approach, in which she has trained Fortune 500 clients and hundreds of workshop attendees, applies to all types of data and all types of organizations.

    Includes numerous templates, detailed examples, and practical advice for executing every step of The Ten Steps approach.
    Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices.
A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from The Ten Steps methodology, and other tools and information that are available online.

“This book is a gem. Tested, validated and polished over a distinguished career as a practitioner and consultant, Danette's Ten Steps methodology shines as a unique and much-needed contribution to the information quality discipline. This practical and insightful book will quickly become the reference of choice for all those leading or participating in information quality improvement projects. Experienced project managers will use it to update and deepen their knowledge; new ones will use it as a roadmap to quickly become effective. Managers in organizations that have embraced generic improvement methodologies such as Six Sigma or Lean, or have developed internal ones, would be wise to hand this book to their Black Belts and other improvement leaders.”

    – C. Lwanga Yonke, Information Quality Practitioner.


    August 30, 2019  12:26 PM

    An Excellent Application and Infrastructure Monitoring Tool @Site24x7

Jaideep Khanduja
    Application monitoring, application performance monitoring, IT Infrastructure Monitoring, Server monitoring

What I feel is that Site24x7 will soon be the top-layer application and infrastructure monitoring tool for all CIOs and CTOs, irrespective of the next layers of support mechanisms they have in place. Site24x7 has a lot of features that are unmatched by any of the existing global-standard monitoring tools. Firstly, none of them has all the features it has. Secondly, it beats all others in one field or the other. And that is why I call it the most suitable and promising top-layer application and infrastructure monitoring tool. Let us learn a little more about Site24x7. It is very lightweight in comparison to its peers in the market from other companies. Also, its agent is non-intrusive. What that means is that the Site24x7 monitoring agent runs almost silently and with minimal dependencies. It consumes very little memory on a system.


Data in the Site24x7 application and infrastructure monitoring tool comes through log files, APIs, and simple commands. The overall proposition is quite cost-effective. That means, with it, you can monitor and manage your infrastructure and servers on a wide range of parameters, from basic availability to server cluster troubleshooting. The pricing has nothing to do with the physical configuration of your server. That is, its pricing is not based on the number of cores, as mostly happens in the market to exaggerate the costing model. Its costing also doesn't consider the RAM size or the number of hours your server needs to run. It empowers you to monitor and control your complete server infrastructure in a holistic manner without any tension about the financial implications. That makes it a first choice for enterprises of any size and volume.
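To give a feel for what a lightweight, non-intrusive agent does, here is a generic sketch (not Site24x7's actual agent; the collector URL is hypothetical, and the host metrics come from the psutil library):

```python
import json, time, urllib.request
import psutil  # assumes the psutil library is installed

COLLECTOR_URL = "https://collector.example.com/metrics"  # hypothetical endpoint

def sample() -> dict:
    """Collect a minimal set of host metrics via simple system calls."""
    return {
        "ts": int(time.time()),
        "cpu_percent": psutil.cpu_percent(interval=1),
        "mem_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def push(metrics: dict) -> None:
    """Ship one sample to the collector as JSON."""
    req = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(metrics).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)

while True:           # the loop sleeps most of the time: a tiny footprint
    push(sample())
    time.sleep(60)
```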


Tell me, how many other tools in the market are so powerful, with so much flexibility and the leanest costing model? Site24x7 is a modern application and infrastructure monitoring tool delivered as a SaaS service. It already serves thousands of businesses and millions of users across the globe. With its new launches of Site24x7 Signals, AI-based intelligent dashboards, and security, this state-of-the-art application and infrastructure monitoring tool has created a new landmark in this field.


    August 29, 2019  9:42 AM

    How GOFRUGAL Is The Best ERP for Trade and Supply Chain @gofrugaltech

Jaideep Khanduja
    ERP, Supply chain, trade

In the previous post, I covered the multifaceted achievements of Kumar Vembu. In this post, we shall learn more about GOFRUGAL ERP. GOFRUGAL ERP is a complete, comprehensive and collaborative core business application that takes care of the entire trade and supply chain ecosystem. It takes care of 100% transaction automation, collaborating with suppliers, customers, partners, and other stakeholders in the ecosystem. GOFRUGAL ERP has evolved intensively, with an in-depth understanding of the minutest of customer needs and their pain points. It is available in different flavors: on-premise, in the cloud, or on mobile. Before buying it, a customer can take a free trial to play around with the solution and experience its unbeatable simplicity. Its key features and benefits include handling customers' omnichannel businesses, customizability, an ML-driven supply chain, and autopilot with AI-driven decisions. In addition, it integrates easily into an existing complex IT environment, as per the requirements of backward and forward integration.


GOFRUGAL customers include businesses of various sizes: small or independent stores, large stores, regional chains, or national chains. During the last 18 years of its evolution, it has catered to more than 40 business formats, including supermarkets, pharmacies, salons, apparel stores, restaurants, bakeries, FMCG, and pharma. Let's see what is latest at GOFRUGAL. Recently, GOFRUGAL launched GoSecure, a real-time backup-as-a-service for traders. More than 90% of retail and distribution businesses don't have a data backup solution in place, which means they run a high risk of losing data at any time. Most of these businesses are not too serious about the safety of their data. GoSecure's real-time backup-as-a-service automatically backs up every transaction that takes place on the system to a cloud-based service in a secure manner.


That means, in case of any kind of contingency, a user can restore the data without any technical support, requiring just an OTP.
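As a minimal sketch of what transaction-level, real-time backup involves (a generic illustration, not GoSecure's implementation; the bucket name is hypothetical, and the example uses the AWS boto3 library against any S3-compatible store):

```python
import json, time, uuid
import boto3  # any S3-compatible object store would do

s3 = boto3.client("s3")
BUCKET = "example-pos-backup"  # hypothetical bucket

def backup_transaction(txn: dict) -> None:
    """Push one committed POS transaction to cloud storage as it happens."""
    key = f"txns/{time.strftime('%Y/%m/%d')}/{uuid.uuid4()}.json"
    s3.put_object(Bucket=BUCKET, Key=key,
                  Body=json.dumps(txn).encode(),
                  ServerSideEncryption="AES256")  # encrypted at rest

# Called from the billing code path after each sale commits:
backup_transaction({"invoice": "INV-1042", "amount": 1299.00, "items": 3})
```

Because every transaction is shipped as it commits, a restore is simply a replay of the stored objects, which is what makes a support-free, OTP-gated recovery flow plausible.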


    August 29, 2019  9:30 AM

    Kumar Vembu and His Entrepreneurial Journey @gofrugaltech

Jaideep Khanduja
    Cloud ERP, ERP, retail

Kumar Vembu is a veteran entrepreneur with deep expertise in technology. Kumar is the CEO and founder of GOFRUGAL. He wears multiple hats: entrepreneur, investor, farmer, yoga expert, and a lot more. He started his career with Qualcomm in San Diego. In 1995, he moved back to India to start his entrepreneurial journey. As a matter of fact, before starting his own venture, he worked with large telecom and networking companies like IITM, HCL, HP, and TeNeT. While working, he was always striving to launch a world-class software product company in India, and that is when he co-founded Zoho Corp with his brother Sridhar Vembu in 1995, and another company, Delium, in 2016. In between these two companies, he launched GOFRUGAL in 2004 with a mission to make ERP easy and usable by businesses of any size.


As of today, GOFRUGAL's multiple solutions are running successfully in more than 30,000 brands across more than 40 business formats, with the sole purpose of digitally transforming business operations to gain a better customer experience and higher sales, thereby acquiring more customers. Let us learn more about GOFRUGAL. GOFRUGAL is a digital-first company. It was founded in 2004 with the singular goal of empowering retail, restaurant, and distribution businesses with appropriate ERP tools that enable them to achieve greater levels of profitability and efficiency. The key focus of all GOFRUGAL applications stays on customer experience and mobility, because these two are the new drivers of growth for any business of any capacity. That is how GOFRUGAL empowers businesses to go digital to stay ahead of their competition and sustain enough agility to grow and succeed continuously.


We shall continue with GOFRUGAL in the concluding post. There are many more interesting facts about this wonderful venture initiated by Kumar Vembu.


    August 19, 2019  10:58 PM

    Care For Data Accuracy? Go For Golden-Record-As-A-Service @Naveego

Jaideep Khanduja
    Data, Master data management, MDM

With the launch of its next-generation data accuracy platform, Naveego has marked another landmark in the field. This next-gen data accuracy platform, with self-service MDM and advanced security features, delivers a golden record across comprehensive enterprise data systems. In fact, this marks the launch of a powerful Golden-Record-as-a-Service offering, thus eliminating the need for costly IT resources. As a matter of fact, it results in 5x faster deployment and a humongous 80% cost saving over legacy solutions. In a short span of time, Naveego has emerged as a leader in cloud-first distributed data accuracy solutions. With the recent launch of the next generation of its complete data accuracy platform, Naveego has set itself at a much higher standing than its counterparts in the market. It comes bundled with self-service Master Data Management (MDM) and Golden-Record-as-a-Service (GRaaS).

Rather, it has become a boon for non-technical business users, who can now manage technology in a much easier and more accurate manner. This powerful offering empowers non-technical executives in an organization to acquire the data they may require for advanced analytics, without any intervention from the IT department or any professional services. GRaaS has many more powerful benefits that help a business rise to new heights. For instance, it ensures that there is a single version of data for all business verticals in an organization; they can all bank on the same data for their respective analytics and reporting. Interestingly, it results in an 80% reduction in cost, and implementation goes 5 times faster than with legacy solutions.

    Data Accuracy – How Critical?

In fact, this next-generation platform includes an advanced, patent-pending security mechanism that performs merging and consistency checks without decrypting data, and without the platform even needing access to the encryption key. The best thing is that it does not require any customization or infrastructure change, which results in a low total cost of ownership (TCO). Moreover, as it is a complete solution, it eliminates the need for highly skilled individuals to implement and maintain the system. That gives another major benefit to an enterprise. As we all know, data in any organization is expanding exponentially at tremendous speed. This is because of the adoption of the latest technologies like artificial intelligence (AI), machine learning (ML), and the internet of things (IoT), and of heterogeneous devices including mobile devices, autonomous vehicles, and a large number of other sources in the ecosystem that sit outside traditional data centers.


All of this has created an essential requirement for data cleansing, which is becoming quite cumbersome for enterprises, as well as highly expensive. According to Gartner, the annual cost to organizations of maintaining bad data is $15 million on average. As a matter of fact, in addition to this, there are other heavy costs that have become a big headache for enterprises, including the high price of legacy systems, customization of existing systems, etc. Bad data puts a debt of $3.1 trillion on the US economy every year. For an individual organization, that figure might look irrelevant, but that is not the case. A company with, for instance, 50,000 incorrect records, at approximately $100 per incorrect record, will incur a cost of $5 million per year to maintain them (see https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards).


In the absence of a proper mechanism, this cost of maintaining incorrect data will keep rising exponentially every year, given the speed at which data is increasing. Another burning issue for organizations is the scrubbing and prepping of data, for which organizations have to hire or outsource high-wage data scientists. Reports indicate that 80% of these high-wage data scientists' time goes into collecting and cleansing inaccurate digital data. Without this cleansing, an organization cannot use the data for analysis. Naveego terms this 'data janitor work': it doesn't match the skills of data scientists, but unfortunately this kind of work eats up most of their time, whereas they are hired to focus on the highly skilled job of data analysis.

That, in fact, creates a vicious circle for organizations, from which they will never be able to escape, and to which they will succumb sooner or later, unless they adopt a powerful system like Naveego's next-generation data accuracy platform, with self-service MDM and advanced security features, to ensure a golden record across all enterprise data systems. Now let us understand how Naveego explains the emergence and importance of the golden record. The complete data accuracy platform that Naveego provides supports hybrid and multi-cloud environments, providing a distributed data accuracy solution. It proactively manages, identifies, and eliminates any kind of customer data accuracy problem across all enterprise data sources, resulting in a single golden record and ensuring data consistency across the enterprise. In turn, it eliminates any chance of data lakes becoming data swamps.
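To make the golden-record idea concrete, here is a toy merge of duplicate customer records using simple survivorship rules (a generic illustration, not Naveego's algorithm; the records and field names are made up):

```python
from collections import defaultdict

# Duplicate customer records from three hypothetical source systems.
records = [
    {"email": "ana@example.com", "name": "Ana Gomez", "phone": None,
     "updated": "2019-01-10", "source": "crm"},
    {"email": "ana@example.com", "name": "A. Gomez", "phone": "555-0100",
     "updated": "2019-06-02", "source": "billing"},
    {"email": "ana@example.com", "name": "Ana Gomez", "phone": "555-0100",
     "updated": "2019-03-15", "source": "support"},
]

def golden_record(dupes: list) -> dict:
    """Survivorship: for each field, keep the most recently updated non-null value."""
    golden = {}
    for rec in sorted(dupes, key=lambda r: r["updated"]):  # oldest -> newest
        for field, value in rec.items():
            if value is not None:
                golden[field] = value  # newer non-null values overwrite older ones
    return golden

# Group by a match key (here: email), then merge each group.
groups = defaultdict(list)
for rec in records:
    groups[rec["email"]].append(rec)

print({key: golden_record(dupes) for key, dupes in groups.items()})
```

Real MDM platforms add fuzzy matching, per-field trust scores for each source, and encrypted processing, but the core shape of the problem is this grouping-and-survivorship step.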


The solution talks to Kubernetes, Apache Kafka, and Apache Spark technologies, ensuring rapid deployment, distributed processing, and flawless integration with data. This data may reside anywhere, in the cloud or on-premise/off-premise; as a matter of fact, it supports all kinds of hybrid and multi-cloud environments. Naveego ensures the accuracy of data of any volume, with real-time streaming from multiple data sources in any environment, irrespective of schema or structure. The key features of Naveego's next-generation data accuracy platform include self-service, Golden-Record-as-a-Service, automated profiling of data sources at the edge (machine learning), automated profiling of any data source including IoT, automatic machine-learning-driven data quality checks, and so on.

    Michael Ger, General Manager, Automotive and Manufacturing Solutions, Cloudera says, “Companies across all industries are reimagining themselves within a digitally transformed future. Central to that future is leveraging a data tsunami resulting from newly connected consumers, products and processes. Within this context, data quality has taken on critical new importance. The Naveego data accuracy platform is critical for enabling traditional approaches to business intelligence as well as modern-day big data analytics. The reason for this is clear – actionable insights start with clean data, and that’s exactly what the Naveego platform delivers.”


    Katie Horvath, CEO, Naveego says, “The ability to achieve golden record data has typically been available only by hiring a systems integrator or other specialist, at a high cost and TCO to the enterprise. The next generation of our Data Accuracy Platform is truly a game-changer, empowering business users to access trusted data across all data types for analytics purposes, entirely on their own with an easy to use, flow-oriented user interface – and at a significantly lower cost. This is sure to disrupt pricey legacy solutions that require vast amounts of professional resources and on average five times longer to deploy.”


    August 19, 2019  9:55 PM

    @Tachyum Joins PCI-SIG for Ultimate Performance of Data Center

Jaideep Khanduja
    PCI SIG, PCIe, PCIe SSD

Tachyum joins PCI-SIG in order to optimize performance for data center AI and HPC workloads. This highly scalable, high-speed I/O solution is applicable to the numerous market applications empowered by Tachyum's processor technology. Let us first understand what PCI and SIG stand for. PCI stands for peripheral component interconnect; SIG stands for special interest group. PCI-SIG is an association of over 700 members committed to the advancement of non-proprietary PCI technology for high-speed I/O in various market applications. The PCI expansion bus in today's environment is the default interconnect between CPUs and peripherals. With increasing demand for higher-performance I/O, the scope and ecosystem reach of PCI is expanding tremendously. In fact, the latest technology roadmaps, including PCI Express, focus on new form factors and lower-power applications.


This is where the association members come in: they collaborate in open communities to define, test, and refine specifications so that companies can bring PCI-compliant devices to market. As innovation in PCIe technology grows, there is a continuous doubling of the bandwidth available to graphics cards, hard drives, Wi-Fi, Ethernet cards, SSDs, and so on. The fourth generation of the PCIe standard supports bandwidth capabilities of 64 GB per second, which, if you notice, is twice that of the PCIe 3.0 interface. Now let us look at the capabilities of PCIe 5.0: it will double bandwidth rates again, to 128 GB per second. Tachyum is doing a great job of integrating PCIe based on customer requirements for storage, peer-to-peer clusters, AI, and as endpoints for accelerator applications.
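As a quick sanity check on those headline numbers, here is the back-of-the-envelope bandwidth arithmetic for a full x16 link (note that the 64 GB/s and 128 GB/s figures count both directions):

```python
# Rough PCIe bandwidth arithmetic for an x16 link.
# Each lane runs at a raw signaling rate (GT/s); line encoding eats some of it.
GENS = {
    # gen: (GT/s per lane, 128b/130b encoding efficiency)
    "3.0": (8.0, 128 / 130),
    "4.0": (16.0, 128 / 130),
    "5.0": (32.0, 128 / 130),
}

for gen, (gts, eff) in GENS.items():
    per_lane_GBps = gts * eff / 8      # GT/s -> GB/s after encoding overhead
    x16_one_way = per_lane_GBps * 16
    print(f"PCIe {gen}: ~{x16_one_way:.0f} GB/s per direction, "
          f"~{2 * x16_one_way:.0f} GB/s bidirectional")
# PCIe 4.0 lands near 32 GB/s each way (~64 GB/s total),
# and PCIe 5.0 doubles that, matching the figures above.
```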


Beyond raw performance, these advances to PCIe improve the support for memory coherency built into the standard, which protocols like CCIX and CXL bring to multi-core processors. This ensures that all copies of data stay coherent. As a matter of fact, Tachyum's Prodigy Universal Processor chip delivers industry-leading advances in performance, energy consumption, space requirements, and data center server utilization, in fully coherent multiprocessor environments utilizing PCIe. Amazingly, Tachyum's Prodigy Universal Processor chip is the smallest and fastest general-purpose 64-core processor produced to date. It requires 10x less processor power and reduces processor cost by 3x. Prodigy directly enables a 32 tensor-exaflop supercomputer, allowing the creation of systems more powerful than the human brain by 2021. That will be one of the biggest landmarks in technology.

The data center TCO, that is, the annual total cost of ownership, reduces by 4x with Prodigy. All this happens because of Tachyum's disruptive processor design, paired with a smart compiler that renders a number of parts of the hardware found in today's typical processors redundant. The result is fewer transistors, simpler cores, fewer and shorter wires, greater speed, and huge power efficiency for the Prodigy processor. Basically, it is now for businesses to decide when to adopt Prodigy from Tachyum: now, to make things simpler, or later, when things have become complicated to a large extent.

    Dr. Radoslav Danilak, CEO of Tachyum says, “Much has been made about the death of Moore’s Law and how the possibility to improve density, power efficiency and cost benefits in the semiconductor industry into the future is problematic. But the advances seen among PCI shows that this simply isn’t true. PCI-SIG already announced PCIe 6.0, which would double bandwidth again within 3 years. We, at Tachyum, will also ensure that the processor will not be the limiting factor either; it just requires a more innovative approach. We are glad to join the efforts of our fellow members in PCI-SIG in improving the speed capabilities of the next PCIe standard to support the performance needs of the data center, AI and Big Data workloads.”


    August 16, 2019  10:36 PM

    Works with SwiftStack Simplifies Cloud Storage Deployments @SwiftStack

Jaideep Khanduja
    Cloud storage, SwiftStack

SwiftStack's new technology partner program, along with 'Works with SwiftStack', is all about validating integration with modern applications that use the S3 API from Amazon. This seamless integration is for the flexibility, scalability, and simplicity of cloud storage, covering public cloud platforms like Amazon S3 and Google Cloud Storage as well as SwiftStack's private and multi-cloud storage. So basically, SwiftStack has launched 'Works with SwiftStack' and its Technology Partner Program in order to simplify cloud storage deployments for its customers. As we all know, SwiftStack is the leader in multi-cloud data storage and management. Its latest launch is to give customers confidence that validated integrations have gone through exhaustive testing to ensure compatibility with SwiftStack's object-based cloud storage solution. SwiftStack's state-of-the-art storage and multi-cloud data management software works well with modern applications using the Amazon S3 API as well as the open-standard Swift API.

As a matter of fact, for a large number of its customers, many commercial software applications have been created or re-architected to take advantage of these APIs, in order to get the flexibility, scalability, and simplicity of cloud storage. That includes both public cloud platforms, like Amazon S3 and Google Cloud Storage, and SwiftStack's private and multi-cloud storage. The APIs and their documentation are easily accessible, helping software developers validate their implementations for enhanced functionality and the best user experience. Basically, the SwiftStack Technology Partner Program and the Works with SwiftStack program both give software developers and vendors a unique method not only to validate their software's functionality with SwiftStack, but also to expand a business relationship for the mutual benefit of the vendor, SwiftStack, and the end users of the joint solution.
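Because SwiftStack speaks the standard S3 API, a basic compatibility check can be as simple as pointing an ordinary S3 client at the cluster. Here is a sketch using the AWS boto3 library; the endpoint URL, credentials, and bucket name are hypothetical:

```python
import boto3

# Point a standard S3 client at a SwiftStack (or any S3-compatible) endpoint.
s3 = boto3.client(
    "s3",
    endpoint_url="https://swiftstack.example.com",  # hypothetical cluster URL
    aws_access_key_id="DEMO_KEY",
    aws_secret_access_key="DEMO_SECRET",
)

# Round-trip a small object to check basic compatibility.
s3.create_bucket(Bucket="works-with-demo")
s3.put_object(Bucket="works-with-demo", Key="hello.txt", Body=b"hello swift")
obj = s3.get_object(Bucket="works-with-demo", Key="hello.txt")
assert obj["Body"].read() == b"hello swift"
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```

The formal validation program naturally goes far beyond this single round trip, but the same principle applies: exercise the API surface the application depends on and confirm the behavior matches.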


There are multiple benefits of joining the SwiftStack Technology Partner Program, including expanding customer reach through integration with industries that need a multi-cloud storage and data management platform. Partners are able to achieve validation with the help of an easy-to-follow self-test plan, thus gaining remarkable results. Partners also have access to a proprietary set of SwiftStack solutions and a dedicated engineer to get the best value from the joint offering. The Works with SwiftStack program is helpful for customers with multiple cloud vendors, who can jointly work to resolve any kind of unexpected issue. The same program also helps customers who develop their own in-house applications to check and ensure that the functionality of their implementation is correct.


As a matter of fact, more than 30 applications have already been verified as part of the Works with SwiftStack program. The complete list of vetted partners can be accessed here. Technology partners interested in joining the SwiftStack Technology Partner Program can get complete information here. SwiftStack was founded in 2011 by a group of experts in cloud computing and, in a short span of 8 years, has become a leading cloud storage provider. Organizations requiring universal access to petabytes of unstructured data in a single namespace find SwiftStack the best solution. SwiftStack software is in high demand in many business verticals, like global service providers, life sciences, web-based businesses, and media and entertainment.


SwiftStack suits industries working in fields like artificial intelligence, machine learning, analytics, active archiving, and scientific research, and those who need to manage data across multiple clouds. SwiftStack customers include industry leaders like eBay, Verizon, HudsonAlpha Institute for Biotechnology, and Pac-12 Networks.


    August 14, 2019  9:13 AM

    @EndemolShineUS Banks On @Nexsan Unity Unified Storage

Jaideep Khanduja
    Nexsan, Unified storage

This is what I call real-life validation of all your technological advancements: a company as large as Endemol Shine North America deploying the Nexsan Unity unified storage solution. Nexsan recently announced that Nexsan Unity now serves as the backbone of Endemol Shine North America's file-based workflow, thus empowering its complete editorial mechanism. Endemol Shine North America is one of the pioneers in delivering global-standard content and exemplary storytelling. It serves multiple platforms in the U.S. and across a large number of countries worldwide. Endemol Shine Group creates, produces, and distributes global content and has diversified into various industry verticals. It is, in fact, known for a number of hit television programs and series across the globe. So, when such a company, with a top-level presence across the globe, adopts a technology solution for its unified storage, it has to be best-in-class.

If you notice, file-based editorial workflows and editing processes are the key elements for a business like Endemol Shine North America. And now, this is being taken care of by the Nexsan Unity unified storage solution. Nexsan®, in fact, is part of the StorCentric® group. It is, as I said above, a global leader in unified storage solutions, including archive and high-volume storage. As a matter of fact, Endemol Shine was already using Nexsan E-Series high-density storage arrays, which are themselves a best-in-class product. To further enhance its capabilities, Endemol Shine has added the Nexsan Unity™ unified storage solution, which now takes care of Endemol Shine's entire editorial process.

    Unified Storage mastered by Nexsan Unity

Global hit shows like Big Brother (CBS), The Real Housewives of Atlanta (Bravo), MasterChef & MasterChef Junior (FOX), Extreme Makeover: Home Edition (HGTV), and The Biggest Loser (USA) are all products of the Endemol group.


    As Alex Palatnick, Vice President, Post Production, Endemol Shine North America says, “Just a few years ago, it was standard to shoot a conventional television show on videotape and work in proxy resolution because editing at high resolution was prohibitive. This offline and then online editing process to conform to each episode was labor-intensive and required multiple assistant editors over two shifts along with expensive broadcast decks. And that was just for one episode. We wanted to overcome these challenges, and instead design a workflow that would allow every edit client to be able to access the media to transcode it for editorial. At times, this would mean as many as 150 simultaneous editors. We designed a new file-based workflow that would require top-notch stable equipment to support it. That’s where Nexsan came in.”

    Palatnick continues, “Today, Nexsan houses our camera masters. When we go into production, and we are doing acquisition – shooting a television show, all of that content lives full-time, until the show wraps, on Nexsan. Our storage must always be in perfect operating condition, and it must be flexible. It must be able to expand and support continuous, planned and unplanned content creation. And, it must enable us to protect, find and retrieve any content, at any time – and that is exactly what Nexsan has provided.”

    Unified Storage has a new benchmark with Nexsan Unity

It is not that it was a cakewalk for Nexsan. In fact, Endemol Shine North America considered most of the other solutions in the fray for the media and entertainment (M&E) vertical, but none other than Nexsan could meet all of its criteria. Mihir Shah, CEO of StorCentric, the parent company of Nexsan, says, “Digitization has become the paradigm in the M&E industry. As new techniques and technologies continue to enter the space, driven by leading-edge innovators such as Endemol Shine North America, data storage requirements will continue to skyrocket. Nexsan’s storage portfolio is able to uniquely support the rigorous requirements of the M&E industry, at a price point that ensures unparalleled ROI.”


    August 8, 2019  12:06 AM

    State of Critical Application Availability In Cloud @SIOSTech

Jaideep Khanduja
    High Availability

As heads of technology of their respective organizations, how many CIOs or CTOs are actually aware of the state of critical business application availability in cloud and hybrid cloud environments? I doubt that more than 20% of them have a real-time alert mechanism for when the situation turns from green to orange or red. Otherwise, it is mostly the end user who acts as the alarm that gets the IT department moving in such cases. If that is the case, it is not at all a healthy situation. Many CIOs, and even businesses, live under the misconception that just because their application is in the cloud, they have high availability. As a matter of fact, I have seen many enterprises migrate to the cloud with the sole notion in mind that moving to the cloud means high availability for their business applications.

Merely migrating to or residing in the cloud is not a synonym for high availability, because in most cases standard cloud offerings don't include high availability. In fact, that is where your role comes in: deciding the best among the many options available, depending on your business needs. Logically, when it is about mission-critical applications, it means effectively no downtime, which limits a CIO to a small number of choices to opt from. Jerry Melnick, president and CEO of SIOS Technology, gives some very valuable insights on how a business can ensure high availability for critical applications in the cloud. Basically, it is very important for a business to understand the promises and myths of High Availability (HA) in the cloud. While selecting your cloud vendor, as a CIO you must be very clear about the HA options and solutions available in the cloud.

    State of High Availability

    As Jerry Melnick says,

    “As IT looks to move their most business-critical applications from the data center to cloud and reap the benefits, they are confronted with a seemingly vast set of choices on how to assure availability and data protection during cloud outages. Sifting through the capabilities and promises the cloud providers market and understanding how they can use these with other technologies to achieve the required SLA’s is a daunting task for even the most seasoned IT veterans.”

As a matter of fact, the technology arm of a business must be very clear about the capabilities and shortcomings of its cloud vendor. Foremost is to understand in what ways the cloud facilities are going to address your HA requirements. You must be aware of the gaps while formulating a comprehensive strategy that aims to cover critical component failures with timely and reliable recovery.
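To ground what reliable fault detection and recovery involve, here is a deliberately naive failover monitor (a toy sketch, not SIOS's product; the endpoint and the promote_standby hook are hypothetical placeholders):

```python
import time
import urllib.request

PRIMARY = "https://app-primary.example.com/health"  # hypothetical endpoint
CHECK_INTERVAL = 5      # seconds between probes
FAILURE_THRESHOLD = 3   # consecutive failures before we act

def healthy(url: str) -> bool:
    """One probe: the app must answer 200 within a tight timeout."""
    try:
        return urllib.request.urlopen(url, timeout=2).status == 200
    except OSError:
        return False

def promote_standby() -> None:
    """Placeholder for the real recovery action (VIP move, DNS update, etc.)."""
    print("failover: promoting standby node")

failures = 0
while True:
    failures = 0 if healthy(PRIMARY) else failures + 1
    if failures >= FAILURE_THRESHOLD:   # avoid flapping on a single blip
        promote_standby()
        break
    time.sleep(CHECK_INTERVAL)
```

Even this toy exposes the design tension Jerry describes below: probe timeouts, failure thresholds, and recovery actions all have to be tuned so the system neither misses real outages nor fails over on a harmless blip, and a real solution must also cover storage, network, and application-level faults.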

Going a little deeper into the technology design of an enterprise-critical application, it must be very clear what application-specific solutions, such as SQL Server availability groups, and cloud vendors are each providing. As Jerry says,

    “Availability is about reliably managing redundancy of all components of the application and infrastructure stack and recovering effectively.  Achieving reliable fault detection, reliable fault recovery, and covering the full scope of errors needed to be detected are essential for handling service outages. The cloud provides many facilities and resources that could theoretically be assembled to do all this. However, the practicality of designing, creating, testing, and maintaining a solution to accurately perform when you need it most is beyond most IT shops.”

If you are clear about the challenges of creating a reliable availability and DR strategy from cloud components and homegrown designs, you are probably driving in the right direction.

    High Availability Strategy

We must understand very clearly that the key requirements for a highly effective HA/DR strategy include the scope of coverage, good provision for anticipating failures well in advance, and reliability in responding to them. As a matter of fact, the challenges of developing custom designs across cloud and on-premise environments are not easy to understand and tackle, given the complexity and scope of the problem and the need for quick resolution to support a critical application's objectives with the most appropriate SLAs. The primary goal of an IT solution architect should be to reliably protect business-critical applications while assessing commercially developed and market-tested solutions for HA/DR. HA/DR solutions from SIOS have been developed and enhanced over the years by best-in-class industry experts who minutely understand the key issues needing immediate attention. Solutions like this are tested and verified, ensuring reliable operation in any situation.

    High Availability

    Basically, the key parameters for assessing a great HA/DR solution are its capabilities to manage operations across a wide range of environments, workloads, and configurations. It is important to restrict the variety of technologies in an enterprise environment in order to ascertain the right approaches to achieving HA/DR across the organization. Each approach will definitely require expert training and support, something most organizations either fail to understand or ignore. In fact, multiple approaches demand developing and executing a number of methodologies to test and validate them under real-world conditions: a wide variety of environments, multiple operational variables, and different use cases. Obviously, the implementation effort, and the complexity, will grow with every additional technology and approach employed for HA/DR. That is why certification of the complete technology stack by the likes of AWS and Microsoft is essential.

    High Availability (HA) and DR Go Hand in Hand

    Looking at all these realities and possibilities, Jerry Melnick concludes:

    “Traditional, commercial solutions such as HA clustering, have evolved and matured over many years. Their design uniquely addresses downtime by covering the full stack detecting and recovering from errors in the application, storage, network, or any level of the infrastructure layer. These capabilities give IT the flexibility to configure HA and DR systems using the cloud, cloud regions and zones, as well as physical and virtual data center resources, to reliably deliver the SLAs they need – so customers get their services and IT can sleep well at night.”


    August 6, 2019  11:37 PM

    PCIe Gen4 Storage Empowered with a Portfolio of Products From Phison

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    NVMe, PCIe

    Phison has become synonymous with industry-first products that prove its leadership in the global market. This time it is about hitting the PCIe Gen4 storage market with a portfolio of products. As a matter of fact, Phison is the first and only company shipping PCIe® Gen4x4 NVMe SSD solutions. That makes it the industry leader enabling high-performance computing for high-speed, high-volume, bandwidth-hungry applications that cater to millions of transactions and large volumes of data movement. Phison has become a landmark in itself and a benchmark for others in developing these solutions. The solutions it provides raise the bar for application expectations, meeting the requirement of faster and higher-definition digital transactions. In fact, these expectations are increasing exponentially with the rapid adoption of newer technologies like big data, the Internet of Things, Machine Learning, Artificial Intelligence, and Virtual/Augmented Reality.

    Very few of us know that the PCIe 4.0 standard has double the data transfer rate of its predecessor, PCIe 3.0. This doubled transfer rate, along with the signal reliability and integrity of PCIe Gen4, empowers technology solution providers to deliver higher performance, enhanced flexibility, and decreased latency to a huge range of applications, including PC, mobile, gaming, networking, and storage.
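
    As a quick back-of-the-envelope check of that doubling, here is a small Python sketch. PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both using 128b/130b encoding, so an x4 link goes from roughly 3.9 GB/s to roughly 7.9 GB/s of usable bandwidth:

        # Raw signaling rates per lane, in gigatransfers per second (GT/s).
        GEN3_GTS = 8.0    # PCIe 3.0
        GEN4_GTS = 16.0   # PCIe 4.0
        ENCODING = 128 / 130   # both generations use 128b/130b line encoding
        LANES = 4              # a Gen4x4 NVMe SSD link

        def usable_gb_per_s(gts, lanes):
            """Approximate usable bandwidth in GB/s (1 GT/s carries ~1 Gb/s per lane)."""
            return gts * ENCODING * lanes / 8   # divide by 8 to go from bits to bytes

        print(f"PCIe 3.0 x{LANES}: ~{usable_gb_per_s(GEN3_GTS, LANES):.2f} GB/s")
        print(f"PCIe 4.0 x{LANES}: ~{usable_gb_per_s(GEN4_GTS, LANES):.2f} GB/s")
        # Prints ~3.94 GB/s versus ~7.88 GB/s: exactly double.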

    Chris Kilburn, corporate vice president, and general manager, Client Channel, AMD says, “There is continued pressure in the industry to improve the performance of computing systems to support the applications that end users are most interested in. PCIe 4.0 offers manufacturers a way to meet these consumer demands. AMD is delighted to work with Phison to raise the bar by introducing first-to-market solutions. Through sound engineering and design, we are working together to deliver the experiences our customers demand.”

    PCIe Gen4 Storage

    Sumit Puri CEO and Co-Founder of Liqid says, “PCIe Gen4 will unleash the performance capabilities required for next-generation data-centric applications, including artificial intelligence and 5G edge computing. The LQD4500 provides 32TB of capacity and a PCIe Gen4x16 interface that enables over 24GB/s of throughput and 4 million IOPS. This impressive performance is only possible by aggregating multiple Phison E16 NVMe controllers into a single device. The Phison E16 provides industry-leading performance, capacity and NVMe features required to build the PCIe Gen4 enabled data center of the future.  Liqid is excited that the Phison E16 is now powering the fastest storage in the world, the LQD4500.”

    PCIe

    Phison's stronghold in memory technology, innovation in flash memory products, and excellence in engineering are the key factors establishing it as a market leader known for its first-to-market expertise. While most of its peers in the market are yet to debut Gen4 solutions, Phison has developed a package of products to cater to multiple sockets within the consumer space. The new portfolio of PCIe Gen4x4 NVMe products will be released within a year's timeframe and includes the PS5016-E16, PS5019-E19T, and PS5018-E18, which will become available in that order.

    PCIe

    K.S. Pua, CEO of Phison Electronics says, “After several years since the announcement of the standard, the era of PCIe 4.0 solutions is upon us and Phison is at the forefront of this movement with our portfolio of Gen4x4 solutions. We pride ourselves with our long history of innovation supporting emerging technologies. From doubling transfer rates to improving power consumption to increasing performance, Phison-based SSD solutions allow our integration partners to deliver the next-generation PC, gaming and storage systems needed to satisfy increasing consumer demand.”

    Phison is showcasing its products at the Flash Memory Summit (FMS), August 6-8 in Booth No. 219 at the Santa Clara Convention Center in Santa Clara, California.


    July 31, 2019  11:49 PM

    Designing Data-Intensive Applications @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems 1st Edition, Kindle Edition by Martin Kleppmann

    Excerpt from Amazon.com

    Data is at the center of many challenges in system design today. Difficult issues need to be figured out, such as scalability, consistency, reliability, efficiency, and maintainability. In addition, we have an overwhelming variety of tools, including relational databases, NoSQL datastores, stream or batch processors, and message brokers. What are the right choices for your application? How do you make sense of all these buzzwords?

    In this practical and comprehensive guide, author Martin Kleppmann helps you navigate this diverse landscape by examining the pros and cons of various technologies for processing and storing data. The software keeps changing, but the fundamental principles remain the same. With this book, software engineers and architects will learn how to apply those ideas in practice, and how to make full use of data in modern applications.

    Peer under the hood of the systems you already use and learn how to use and operate them more effectively
    Make informed decisions by identifying the strengths and weaknesses of different tools
    Navigate the trade-offs around consistency, scalability, fault tolerance, and complexity
    Understand the distributed systems research upon which modern databases are built
    Peek behind the scenes of major online services, and learn from their architectures

    From the Publisher:

    Who Should Read This Book?
    If you develop applications that have some kind of server/backend for storing or processing data, and your applications use the internet (e.g., web applications, mobile apps, or internet-connected sensors), then this book is for you.

    This book is for software engineers, software architects, and technical managers who love to code. It is especially relevant if you need to make decisions about the architecture of the systems you work on—for example if you need to choose tools for solving a given problem and figure out how best to apply them. But even if you have no choice over your tools, this book will help you better understand their strengths and weaknesses.

    You should have some experience building web-based applications or network services, and you should be familiar with relational databases and SQL. Any non-relational databases and other data-related tools you know are a bonus, but not required. A general understanding of common network protocols like TCP and HTTP is helpful. Your choice of programming language or framework makes no difference for this book.

    If any of the following are true for you, you’ll find this book valuable:
    You want to learn how to make data systems scalable, for example, to support web or mobile apps with millions of users.
    You need to make applications highly available (minimizing downtime) and operationally robust.
    You are looking for ways of making systems easier to maintain in the long run, even as they grow and as requirements and technologies change.
    You have a natural curiosity for the way things work and want to know what goes on inside major websites and online services. This book breaks down the internals of various databases and data processing systems, and it’s great fun to explore the bright thinking that went into their design.

    Sometimes, when discussing scalable data systems, people make comments along the lines of, ‘You’re not Google or Amazon. Stop worrying about scale and just use a relational database’. There is truth in that statement: building for scale that you don’t need is wasted effort and may lock you into an inflexible design. In effect, it is a form of premature optimization. However, it’s also important to choose the right tool for the job, and different technologies each have their own strengths and weaknesses. As we shall see, relational databases are important but not the final word on dealing with data.

    Scope of This Book
    This book does not attempt to give detailed instructions on how to install or use specific software packages or APIs since there is already plenty of documentation for those things. Instead, we discuss the various principles and trade-offs that are fundamental to data systems, and we explore the different design decisions taken by different products.

    We look primarily at the architecture of data systems and the ways they are integrated into data-intensive applications. This book doesn’t have space to cover deployment, operations, security, management, and other areas—those are complex and important topics, and we wouldn’t do them justice by making them superficial side notes in this book. They deserve books of their own.

    Many of the technologies described in this book fall within the realm of the Big Data buzzword. However, the term ‘Big Data’ is so overused and underdefined that it is not useful in a serious engineering discussion. This book uses less ambiguous terms, such as single-node versus distributed systems, or online/interactive versus offline/batch processing systems.

    This book has a bias toward free and open-source software (FOSS) because reading, modifying, and executing source code is a great way to understand how something works in detail. Open platforms also reduce the risk of vendor lock-in. However, where appropriate, we also discuss proprietary software (closed-source software, software as a service, or companies’ in-house software that is only described in the literature but not released publicly).


    July 31, 2019  11:41 PM

    NAND Flash Memory Technologies @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    NAND Flash Memory Technologies (IEEE Press Series on Microelectronic Systems) 1st Edition, Kindle Edition by Seiichi Aritome (Author)

    Offers a comprehensive overview of NAND flash memories, with insights into NAND history, technology, challenges, evolutions, and perspectives

    Describes new program disturb issues, data retention, power consumption, and possible solutions for the challenges of 3D NAND flash memory

    Written by an authority in NAND flash memory technology, with over 25 years’ experience

    From the Back Cover
    Examines the history, basic structure, and processes of NAND flash memory

    This book discusses basic and advanced NAND flash memory technologies, including the principle of NAND flash, memory cell technologies, multi-bits cell technologies, scaling challenges of the memory cell, reliability, and 3-dimensional cell as the future technology. Chapter 1 describes the background and early history of NAND flash. The basic device structures and operations are described in Chapter 2. Next, the author discusses the memory cell technologies focused on scaling in Chapter 3 and introduces the advanced operations for multi-level cells in Chapter 4. The physical limitations for scaling are examined in Chapter 5, and Chapter 6 describes the reliability of NAND flash memory. Chapter 7 examines 3-dimensional (3D) NAND flash memory cells and discusses the pros and cons in structure, process, operations, scalability, and performance. In Chapter 8, challenges of 3D NAND flash memory are discussed. Finally, in Chapter 9, the author summarizes and describes the prospect of technologies and market for the future NAND flash memory.

    NAND Flash Memory Technologies is a reference for engineers, researchers, and designers who are engaged in the development of NAND flash memory or SSD (Solid State Disk) and flash memory systems.

    About the Author
    Seiichi Aritome was a Senior Research Fellow at SK Hynix Inc. in Icheon, Korea from 2009 to 2014. He has contributed to NAND flash memory technologies for over 27 years in several companies and nations. Aritome was a Program Director at Powerchip Semiconductor Corp. in Hsinchu, Taiwan, a Senior Process Reliability Engineer at Micron Technology Inc. in Idaho, USA, and a Chief Specialist at Toshiba Corporation in Kawasaki, Japan. He received his Ph.D. from Graduate School of Advanced Sciences of Matter, Hiroshima University, Japan. Aritome is an IEEE Fellow and a member of the IEEE Electron Device Society.


    July 31, 2019  11:01 PM

    Phison Showcases Technology Innovation Leadership at FMS2019

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Flash memory, NVMe, PCIe, PCIe SSD, SSD

    If you are at the Flash Memory Summit, which runs from August 6 to August 8 at the Santa Clara Convention Center in Santa Clara, California, don't forget to visit booth number 219, because you are going to witness some of the most innovative technology there. Phison is one of the pioneering companies delivering best-in-class SSD solutions. In fact, Phison is the only company with PCIe Gen4x4 NVMe SSD solutions at Flash Memory Summit. There will be a lot of partner demos and interesting panel participation at the summit. Phison brings its technology innovation leadership on full display at Flash Memory Summit 2019. Phison Electronics is the industry leader in flash controller and NAND solutions. It will showcase its lineup of PCIe Gen4 SSD solutions, including the public debut of its power-conscious PS5019-E19T controller, at booth number 219.

    Phison

    As a matter of fact, it is the first and only company ready with PCIe Gen4x4 NVMe SSD solutions. There will be demonstrations of how its controllers push the boundaries of low power consumption and high performance for storage. Phison's E19T controller, being shown publicly for the first time, offers low power consumption for mainstream drives and at the same time promises best-in-class power savings while reducing cooling needs in data centers. That is, of course, a phenomenal achievement. In addition, Phison is offering a preview of the company's next-generation Gen4x4 PS5018-E18 controller, an optimized design that gains high-performance advantages from the PCIe 4.0 interface, enabling the company to mark a new achievement in performance leadership in Gen4 SSDs.

    Phison Elevates Scale in SSD Solutions

    That is not all: the company will also showcase for the first time its PS5013-E13T 1113 BGA SSD at FMS. With this, customers get all the advantages of flash technology in an ultra-thin, ultra-compact 1113 BGA form factor. The E13T BGA SSD can perform up to 1.7 GB per second sequential read and 1.1 GB per second sequential write while consuming only 1.5 watts, which helps prolong the battery life of any embedded solution. At the same booth, there will be more demonstrations from Phison's technology partners Liqid and Cigent Technology Inc. Liqid will showcase its ultra-high-performance Gen4 NVMe full-length, full-height add-in card, model LQD4500, powered by Phison's E16 controller. It is capable of 5 million IOPS and 24 GB per second of throughput, and the card is available with up to 32 TB of capacity.

    On the other hand, Cigent will demonstrate its Dynamic Data Defense Engine (D3E™) for Windows 10. D3E, when paired with Phison's E12-based SSDs, helps prevent the exfiltration of sensitive data as soon as a system is compromised. As a matter of fact, the Phison E12 allows D3E to support "on the fly" firmware-based folder locking: the moment the threat level is elevated, those folders can be accessed only with higher-level authentication.

    Phison Electronics

    K.S. Pua, CEO, Phison Electronics says, “Whether in the audience at one of our speaker presentations or stopping by our booth for a demonstration of our next-generation technologies, FMS attendees will have an excellent opportunity to learn how Phison is leading the way in delivering high-performance solutions that meet the ever-increasing needs of the data storage market. FMS is the ideal setting for us to demonstrate this leadership, as well as the perfect venue to publicly show our E19 for the first time.  We look forward to a great show.”


    July 30, 2019  11:33 PM

    Storage Economics Sets A New Benchmark with S1aaS @StorOne_Inc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Backup storage, Data storage software, Dell, FPGA, Mellanox, Storage

    Storage economics gets a new paradigm shift with the announcement of S1-as-a-Service (S1aaS) from StorONE. What it means is that a comprehensive enterprise storage solution is now feasible on a very cost-effective subscription model that never existed before, and that no other venture had imagined or initiated so far. StorONE's innovation behind the S1 storage software platform brings S1aaS. S1aaS, in simple terms, is a use-based solution integrating the enterprise-class S1 storage service with Dell Technologies and Mellanox hardware. This integrated solution showcases the next level of storage economics, where the industry can define its own price point. The transformation delivers a perfect balance between enterprise-level performance and data protection capabilities. It also provides the best answer to a long-pending question that has hovered around without a solution: proper resource utilization, which StorONE has addressed by beautifully balancing security risks and performance impacts.

    Storage Economics

    These security risks and performance impacts are usually deeply connected with the cloud, which on one hand provides reliability and capabilities as good as those of an on-premises model and on the other guarantees you pay only for what you need to support your business requirements. All of this happens flawlessly and in a very transparent manner. It is one of the best propositions for the customer, who gets the best of both worlds: cloud-like simplicity on one hand and flexible pricing on the other. The environment remains that of a cloud-based model, while the performance and control are like those of on-premises infrastructure.

    Storage Economics Achieves a New Landmark

    The pricing of the S1aaS model starts at $999 per month for an 18-terabyte (TB) all-flash array that performs up to 150,000 IOPS. This, as a matter of fact, is the most flexible model, with customer-definable pricing along with the best capacity and performance capabilities available in the market.
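
    For a sense of what that entry price works out to, here is a tiny Python sketch using only the figures quoted above:

        # Entry-tier figures taken from the announcement above.
        monthly_cost_usd = 999
        capacity_tb = 18
        iops = 150_000

        print(f"Cost per TB per month:  ${monthly_cost_usd / capacity_tb:.2f}")    # ~$55.50
        print(f"Cost per 1,000 IOPS/mo: ${monthly_cost_usd / (iops / 1000):.2f}")  # ~$6.66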

    Gal Naor, CEO, and co-founder of StorONE says, “S1aaS is going to change the economics not only of storage but of the entire data center. S1aaS makes enterprise-class all-flash array performance and data protection and control available for only $999 per month. No other vendor can offer a complete storage solution – whether on-premises or in the cloud – for this low of a monthly cost.”

    George Crump, Founder, and Lead Analyst, Storage Switzerland says, “As high-performance, SAS and NVMe flash drives become commonplace in the data center, storage media is no longer the bottleneck to performance. The storage management layer is a problem. Vendors try to compensate by using more powerful processors, more RAM, custom FPGAs, and ASICs, as well as spreading I/O across dozens of flash drives, whose capacity is not needed. StorONE’s focus on efficiency – 150K IOPS from four conventional drives, an industry-defining capability – is the foundational component of S1aaS. It enables the democratization of storage performance previously unavailable to the data center.”

    Motti Beck, Senior Director Enterprise Market Development, Mellanox says “Advanced storage solutions like this require high-performance, programmable and intelligent networks. The combination of StorONE’s S1 software and Mellanox Ethernet Storage Fabric solutions eliminate the traditional bottlenecks that have been associated with the server to storage communication and supports critical storage features, which improves data center efficiency and ensures the best user experience available.”

    Further information about S1aaS is available at http://bit.ly/2JWBb56


    July 24, 2019  11:44 PM

    NVIDIA Partner Network Strengthens With the Entry of @SwiftStack @NVIDIA

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Cloud data storage, Machine learning, NVIDIA, Public Cloud, SwiftStack

    According to the latest release, SwiftStack joins the NVIDIA Partner Network. It means a lot to major industries like automobile, healthcare, and telecom, to name a few. Autonomous vehicles, telecom, and healthcare can now leverage large-scale artificial intelligence and machine learning data pipelines from edge to core to cloud. That covers a complete spectrum, in fact. SwiftStack, as we all know, is a market leader in multi-cloud data storage and management. The company announced its entry into the NVIDIA Partner Network (NPN) program as a key solution provider for the latest technologies like artificial intelligence and machine learning use cases. SwiftStack uses NVIDIA DGX-1 systems and the NGC container registry of GPU-optimized software in its latest state-of-the-art storage solution. The solution covers large-scale Artificial Intelligence (AI) and Machine Learning (ML) along with deep learning workflows that span edge-to-core-to-cloud data pipelines for the use cases mentioned above.

    NVIDIA Partner Network

    As a matter of fact, the NPN Solution Advisor Program gives NVIDIA customers full access to world-class solution experts with deep knowledge of enterprise-wide integration with NVIDIA DGX-1 clusters. To add further value, SwiftStack's AI/ML solution can deliver massive storage parallelism and throughput to NVIDIA GPU compute and NGC. The use cases cover a wide range, including data ingest, training and inferencing, and data services, to support any kind of AI/ML workflow. On top of that, the SwiftStack 1space solution extends to the public cloud so that customers can benefit from cloud-bursting and economies of scale while the data stays secured on-premises.

    SwiftStack joins NVIDIA Partner Network

    Amita Potnis, Research Director at IDC’s Infrastructure Systems, Platforms and Technologies Group says, “Infrastructure challenges are the primary inhibitor for broader adoption of AI/ML workflows. SwiftStack’s multi-cloud data management solution is the first of its kind in the industry and effectively handles storage I/O challenges faced by the edge to core to cloud, large-scale AI/ML data pipelines.”

    NVIDIA Partner Network

    Shailesh Manjrekar, Head of AI/ML Solutions Marketing and Corporate Development at SwiftStack says, “The SwiftStack solution accelerates data pipelines, eliminates storage silos, and enables multi-cloud workflows, thus delivering faster business outcomes. Joining NVIDIA’s Partner Network program builds upon the success we are seeing with large-scale AI/ML data pipeline customers and endorses our value to these environments.”

    Craig Weinstein, Vice President, Americas Partner Organization at NVIDIA says, “NVIDIA AI solutions are used across transportation, healthcare and telecommunication industries. Our high-performance computing platform needs fast storage and SwiftStack brings on-premises, scale-out, and geographically distributed storage that makes them a good fit for our NPN Solution Advisor Program.”


    July 21, 2019  11:33 PM

    Protect Your Physical, Virtual, and Cloud With NAKIVO v9 @Nakivo

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Recovery, Deduplication, Nakivo, VM backup

    With the release of NAKIVO v9, support for physical Windows Server backup is here. As a matter of fact, NAKIVO Backup & Replication v9 now provides 100% protection for physical, virtual, and cloud environments. NAKIVO Inc. has proven its mettle in a very short span. It is one of the fastest-growing software companies, with a sole aim of providing enterprise solutions to protect virtual and cloud environments. With this announcement, NAKIVO creates a new landmark across physical, virtual, and cloud environments. The new version, v9, adds support for Microsoft Windows Server backup, thereby empowering customers to safeguard physical, virtual, and cloud environments from a single point. That is an extremely useful feature for an enterprise from a system-upkeep point of view. There are certain key features of the new release. For instance, it supports application-consistent backup.

    What we mean by application-consistent backup is that NAKIVO Backup & Replication v9 can take incremental, application-aware backups of physical Microsoft Windows Servers. The solution now provides application-consistent backups of business-critical applications, including databases, running on physical Windows Servers. This covers Microsoft Exchange, SQL Server, Active Directory, and SharePoint, as well as Oracle. Another key feature is global data deduplication. Backups of physical servers can now be stored in a regular backup repository, alongside backups of VMs and AWS EC2 instances. All the backups stored in a repository are automatically deduplicated irrespective of platform, ensuring that only unique data blocks are saved. This results in tremendous savings in the storage space used by physical machine backups.
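
    To illustrate how storing only unique blocks saves space, here is a minimal content-addressed deduplication sketch in Python. It is illustrative only, not NAKIVO's actual implementation; the fixed 4 KB block size and SHA-256 fingerprinting are assumptions:

        import hashlib

        BLOCK_SIZE = 4096  # hypothetical fixed block size; real products vary

        def dedup_store(streams):
            """Store only unique blocks across backup streams from any platform.

            The repository maps a block's SHA-256 fingerprint to the block, so a
            block seen in a physical-server backup and again in a VM backup is
            kept only once; each manifest records how to reassemble its stream."""
            repository = {}
            manifests = []
            for data in streams:
                manifest = []  # ordered fingerprints for this backup
                for i in range(0, len(data), BLOCK_SIZE):
                    block = data[i:i + BLOCK_SIZE]
                    digest = hashlib.sha256(block).hexdigest()
                    repository.setdefault(digest, block)
                    manifest.append(digest)
                manifests.append(manifest)
            return repository, manifests

        repo, _ = dedup_store([b"A" * 8192, b"A" * 8192 + b"B" * 4096])
        print(f"Unique blocks stored: {len(repo)}")  # 2 unique blocks out of 5 total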

    NAKIVO Backup & Replication v9 also means instant granular recovery: you can instantly recover files, folders, or Microsoft application objects directly from previously created, deduplicated physical machine backups. In addition, customers can use the Universal Application Object Recovery feature when instant recovery of objects from any other application is required. The new version supports Physical to Virtual (P2V) as well. Besides instantly recovering files, folders, and objects from physical server backups, enterprises can restore physical Windows Server backups to VMware and Hyper-V VMs. While the new version tackles many complexities, the pricing model is quite simple. It starts at $17 per machine per year, probably the most cost-effective per-machine subscription model. A single per-machine license covers a VMware VM, a Hyper-V VM, a physical machine, a Nutanix AHV VM, or an AWS EC2 instance.

    This gives customers a high level of flexibility and minimal dependence on vendors. Customers can now easily remove vendor lock-in and move their workloads between platforms without changing their data protection licensing. Bruce Talley, CEO, NAKIVO Inc. says, “NAKIVO Backup & Replication v9 enables our customers to not only protect their business-critical workloads across virtual and cloud environments but now also physical Windows Server systems. Now our customers who have physical or mixed environments can protect their critical business data from a single pane of glass.”

    RESOURCES
    Trial Download: www.nakivo.com/resources/download/trial-download/
    Success Stories: www.nakivo.com/customers/success-stories/
    Datasheet: www.nakivo.com/res/files/nakivo-backup-replication-datasheet.pdf


    July 14, 2019  12:10 AM

    1st Data Orchestration Platform with Multi-cloud Analytics AI @Alluxio

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud analytics, data orchestration

    Alluxio is a well-known name among the world's top internet companies; in fact, 7 out of the top 10 use open-source data orchestration technology developed by Alluxio. The launch of Alluxio 2.0 at the recent AWS Summit in New York adds a lot more power to it. The open-source and enterprise editions of Alluxio 2.0 simplify and accelerate the adoption and deployment of multi-cloud, data-hungry workloads. Alluxio 2.0 brings breakthrough innovations for data engineers who are responsible for managing and deploying analytical and AI workloads in the cloud.

    The solution works equally well in hybrid and multi-cloud environments. Demand for compute workloads is tremendous across the globe, and cloud adoption has multiplied this requirement exponentially. Organizations are adopting a decoupled architecture for modern workloads, in which compute scales independently from storage. That brings new data engineering problems.

    Data Orchestration

    The new paradigm definitely enables elastic scaling, but at the same time it creates new data engineering problems, and that raises an immediate need for an abstraction layer. Just as Kubernetes orchestrates compute and containers, data badly needs orchestration as data silos increase. Data orchestration will not only bring data locality but also enable data accessibility and data elasticity for compute across data silos, whether those silos sit in different zones, regions, or even clouds. That is where Alluxio 2.0 Community Edition and Enterprise Edition come into the picture. The two editions bring new capabilities across all the critical segments causing gaps in today's cloud data engineering market. Alluxio 2.0 is a true example of breakthrough data orchestration innovation for multi-cloud: it delivers policy-driven data management, improved administration, efficient cross-cloud data services, focused compute, and integration.
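
    As a rough illustration of what policy-driven data placement can look like, here is a tiny Python sketch. The Policy class, tier names, and rule syntax are invented for illustration and are not Alluxio's actual API:

        from dataclasses import dataclass

        @dataclass
        class Policy:
            """A hypothetical placement rule: keep hot data near compute,
            age cold data out to cheaper storage."""
            path_prefix: str
            hot_tier: str
            cold_tier: str
            max_age_days: int

        def place(path, age_days, policies):
            """Return the storage tier a file should live on under the rules."""
            for rule in policies:
                if path.startswith(rule.path_prefix):
                    return rule.hot_tier if age_days <= rule.max_age_days else rule.cold_tier
            return "default-storage"

        rules = [Policy("/datasets/training", "local-nvme-cache", "s3://cold-archive", 7)]
        print(place("/datasets/training/day1.parquet", 2, rules))    # local-nvme-cache
        print(place("/datasets/training/day1.parquet", 30, rules))   # s3://cold-archive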

    Data Orchestration

    Haoyuan Li, Founder and CTO, Alluxio says, “With a data orchestration platform in place, a data analyst or scientist can work under the assumption that the data will be readily accessible regardless of where the data resides or the characteristics of the storage. They can focus on building data-driven analytical and AI applications to create values, without worrying about the environment and vendor lock-in. These new advancements to Alluxio’s data orchestration platform further cement our commitment to a cloud-native, open-source approach to enabling applications to be compute, storage and cloud agnostic.”

    Mike Leone, Analyst, ESG says, “Data is only as useful as the insights derived from it and with organizations trying to analyze as much data as possible to gain a competitive edge, it’s challenging to find useful data that’s spread across globally-distributed silos. This data is being requested by various compute frameworks, as well as different types of users hoping to gain actionable insight. These multiple layers of complexity are driving the need for a solution to improve the process of making the most valuable data accessible to compute at the speed of innovation. Alluxio has identified an important missing piece that makes data more local and easily accessible to data-powered compute frameworks regardless of where the data resides or the characteristics of the underlying storage systems and clouds.”

    Data Orchestration

    Steven Mih, CEO, Alluxio says, “Whether by design or by departmental necessity, companies are facing an explosion of data that is spread across hybrid and multi-cloud environments. To maintain a competitive advantage, speed and depth of insight have become the requirement. Data-driven analytics that was once run over many hours, now need to be done in seconds. AI/ML models need to be trained against larger-and-larger datasets. This all points to the necessity of a data tier which orchestrates the movement and policy-driven access of a companies’ data, wherever it may be stored. Alluxio abstracts the storage and enables a self-service culture within today’s data-driven company.”

    Both Alluxio 2.0 Community and Enterprise Edition are now generally available for download via tarball, docker, brew, etc.

    Resources

    Alluxio 2.0 release page – https://www.alluxio.io/
    Download Alluxio 2.0 – https://www.alluxio.io/
    Founder blog – https://www.alluxio.io/blog/
    Product blog – https://www.alluxio.io/blog/2-


    July 13, 2019  11:12 PM

    Tachyum Inc’s 64-core processor cuts processor power by 10x @Tachyum

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Data Center, processor

    Tachyum is a combination of two Greek words collectively symbolizing 'an element of speed', and the company fits the name best. It continuously strives to beat its own records by bringing a better product every time, and every product it brings is a world first. The top management of Tachyum seems firm on keeping the company creating new landmarks for others to follow, and the recent news is a good example: Tachyum is bringing in $25 million in Series A financing. Rado Danilak, CEO, Tachyum, has a couple of significant laurels to his credit. His last two companies were bought by Western Digital (WD) and LSI/SanDisk. He holds more than 100 patents, which are already in production, and has deep knowledge of the semiconductor market in general.

    Tachyum

    Anybody dealing in chips and processors would be conversant with the Prodigy Universal Processor Chip from Tachyum. It is the smallest and fastest general-purpose, 64-core processor developed to date across the globe. The money mentioned above is being used for further enhancement of the Prodigy Universal Processor Chip. It requires 10x less processor power than the nearest competitive chips in the market from Intel, NVIDIA, and AMD. Another huge disruption it brings is a 3x cost reduction. These two factors are more than enough to shake existing markets and redefine the leadership chart in these segments. The development matches well with the AI revolution, which demands machines more powerful than the human brain. The ultimate goal is to deliver AI for Good and AI for All. Prodigy has enormous strength: it reduces a data center's annual total cost of ownership (TCO) by 4x.

    Tachyum brings a new revolution in AI with Prodigy

    Prodigy from Tachyum is a sheer example of disruptive hardware architecture paired with a smart compiler. In fact, this new design has made many parts of the hardware in a typical processor redundant. The core has become simpler and smaller, and the wires fewer and shorter. All of this results in greater speed and power efficiency for the processor. That is the Prodigy Universal Processor Chip: an ultra-low-power processor capable of enabling an exaflop supercomputer using around 250,000 Prodigy processors.
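
    A quick sanity check of the arithmetic behind that claim, as a tiny Python sketch: spreading one exaflop across 250,000 chips implies roughly 4 teraflops per Prodigy processor.

        processors = 250_000
        target_flops = 1e18   # one exaflop = 10^18 floating-point operations per second
        per_chip_tflops = target_flops / processors / 1e12
        print(f"Each chip must sustain ~{per_chip_tflops:.0f} TFLOPS")   # ~4 TFLOPS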

    Tachyum

    Adrian Vycital, Managing Partner at IPM Group, Tachyum’s lead investor based in London and Bratislava says, “The work that Tachyum is doing is highly disruptive and will lead to dramatic improvements in burgeoning markets of artificial intelligence and high-performance computing that require extreme processing speeds and power efficiencies. Supporting Tachyum at this stage of their development provides cascading opportunities for unprecedented success, helping them to establish themselves as the leader in what truly is the future of computing.”

    Dr. Radoslav Danilak, Co-founder and CEO of Tachyum says, “We are extremely pleased to announce another infusion of working capital into Tachyum, which not only enables us to complete our mission of delivering disruptive products to market but also represents well-reasoned confidence in our approach to overcoming challenges faced by the industry. The ability to change the world takes more than one man’s vision. Having an investment community backing Tachyum allows us to properly build a world-class organization with the best and brightest talent available. We look forward to growing the company and the industry atop the foundation that we’ve already built.”

    You can visit the official website here: http://www.tachyum.com


    July 10, 2019  9:58 PM

    How Safe Is Your Enterprise Backup Data from Malware Attack? @asigra

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Asigra, Backup and restore, Backup Recovery and Media Services, Cloud Backup, Enterprise Backup, malware

    How many CIOs and CTOs can claim with certainty that the enterprise backups they take regularly are not contaminated with any kind of malware? A recent report published by DCIG features cybersecurity approaches from legends in the field like Asigra, Rubrik, and Dell. Detecting and preventing malware in your enterprise backup environments is as critical as in production environments. Unless you have sufficient knowledge and tools to detect it, it is impossible to respond and keep your enterprise data well within safe limits. Asigra Inc. has been a pioneer in cloud backup, recovery, and restore solutions since 1986. It has just announced that the Data Center Infrastructure Group (DCIG) has come out with an important report titled "Creating a Secondary Perimeter to Detect Malware in Your Enterprise Backup Environment." I think the report is important reading for all CIOs and CTOs.

    enterprise backup

    source: asigra.com

    CTOs and CIOs must read this report in order to understand the threats and vulnerabilities they are living with in this regard. The report gives quite a number of useful insights. It presents a comparison of three approaches that any enterprise can use to detect and prevent malware attacks on backup data, and it further analyzes which approach may be the most effective for enterprise backup environments. It is the purity of your backup sets that creates confidence in recovering lost data when the need arises, so understanding this is of foremost importance. As a matter of fact, enterprises are now understanding the importance of managing the threat that malware poses to backup data in today's high-risk environments. A successful recovery depends completely on having a confirmed, reproducible set of enterprise backups that is completely free from malware.

    Enterprise Backup Data

    The three methodologies the DCIG report describes for creating a golden copy of enterprise backup data are as follows. The first is the inline scan, in which all incoming and restored backup data is actively scanned for malware in real time. The second is a sandbox approach: no scan happens while creating a backup set, but a separate IT sandbox is set up to recover the data and test it thoroughly for malware. The third is snapshot analysis, in which snapshots of production data are captured and analyzed thoroughly; the result of the analysis decides which sets are infected with malware. Of these three, the most appropriate method is the inline scan of backup and recovery data.
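
    To see why the inline approach wins, here is a minimal Python sketch of signature scanning applied to a backup stream as it is written. The signature set and chunk handling are hypothetical; a real product embeds full malware engines rather than a simple substring match:

        # Hypothetical signature set; a real engine embeds full malware definitions.
        KNOWN_SIGNATURES = {b"EICAR-TEST", b"EVIL-PAYLOAD"}

        def inline_scan(backup_stream):
            """Yield clean chunks of a backup stream, scanning each as it arrives.

            Raises on the first signature match so the infected backup set can be
            quarantined instead of being written to the repository."""
            overlap = max(len(s) for s in KNOWN_SIGNATURES) - 1
            carry = b""  # tail of the previous chunk, so signatures spanning a
                         # chunk boundary are still caught
            for chunk in backup_stream:
                window = carry + chunk
                if any(signature in window for signature in KNOWN_SIGNATURES):
                    raise RuntimeError("Malware signature found: quarantine the backup set")
                carry = window[-overlap:]
                yield chunk

        clean = b"".join(inline_scan([b"backup block 1 ", b"backup block 2"]))
        print(f"Stored {len(clean)} clean bytes")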

    Enterprise Backup

    Source: Asigra.com

    As DCIG states, “Inline scans represent the easiest and fastest way for a company to scan its backup data for the presence of known strains of malware as well as position the company to scan recovered data for yet unknown malware signatures.” That is where the Asigra enterprise backup solution comes into the picture as a top contender. The report suggests Asigra Cloud Backup V14 as an optimum solution for inline scanning of malware.

    Jerome Wendt, Founder, and President, DCIG says, “The products that Asigra, Dell EMC, and Rubrik offer, and the respective techniques they use to detect the presence of malware in backup repositories, represent the primary methodologies that backup software employs. Of these three, only Asigra and Rubrik provide a company with the means to automate and simplify the process to detect malware in backups. Of those two, only Asigra currently makes cybersecurity software available as an optional feature that a company can turn on.”

    Enterprise Backup Solution

    Eran Farajun, Executive Vice President, Asigra says, “Asigra Cloud Backup V14 converges enterprise data protection and cybersecurity, embedding malware engines in the backup and recovery streams to prevent ransomware from impacting the business. Asigra identifies any infecting malware strains, quarantines them, then notifies the customer. It is a very comprehensive data protection solution, built from the ground up for distributed IT environments.”

    You can download the free DCIG report here: http://library.asigra.com/dcig-report


    July 10, 2019  5:17 PM

    @SwiftStack Enables @dcBLOXinc To Deliver Multi-Region Cloud Storage

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud storage, SwiftStack

    DC BLOX is a multi-tenant data center provider in the Southeastern U.S. The company designs and manages highly secure and reliable data centers for almost all segments of clients: government and education, enterprise, healthcare, content providers, life sciences, and managed service providers. That is a huge spectrum to serve. You will find many of their state-of-the-art data centers in traditionally underserved markets. This way they are able to provide affordable business-class cloud storage and colocation services, along with a private high-performance network in the Southeast, with an aim to guarantee business continuity. The company also supports hybrid IT environments with the least upfront capital investment and without compromising an iota of quality. Recently, DC BLOX selected SwiftStack to deliver a large-scale, multi-region cloud storage service for the Southeastern U.S. That is a huge volume to cater to with a seamless service.

    SwiftStack is a market leader in multi-cloud storage and its management. DC BLOX decided to deploy SwiftStack software to boost its own multi-region, hyperscale cloud storage service for its large customer base's business continuity (BC) and disaster recovery (DR) needs. As a business grows, customer expectations rise, and to meet those expectations a service provider needs a foolproof, stable system in place. DC BLOX was facing tremendous demand for affordable secondary storage services from its existing customers, and at the same time it was under high pressure to scale up and expand to more regions. That resulted in an immediate requirement for a new storage platform that could seamlessly perform, scale up, and cater to multiple geographic regions, along with an intense requirement to support both traditional and cloud-native applications.

    Seamless Multi-Region Cloud Storage

    Among the many object and file storage options from leading vendors, DC BLOX found SwiftStack the most suitable after a comprehensive evaluation process. Thus SwiftStack was given the nod to provide a turnkey platform for DC BLOX Cloud Storage. With the help of SwiftStack, DC BLOX has been able to create a multi-region cluster that currently spans three locations, with a fourth coming shortly and 15 more planned within a stipulated timeframe.

    Chris Gatch, CTO at DC BLOX says, “SwiftStack helped us reduce the cost of storing and utilizing data, based on a comparison with other choices we considered, including the ability to manage more data with a smaller headcount. Along with savings at scale, we are able to offer innovative data services for a more compelling, more competitive solution.”

    Erik Pounds, Vice President of Marketing at SwiftStack says, “DC BLOX offers a cloud solution that addresses the needs of the business communities they serve, and also has unique differentiators to let them compete with global public cloud providers. Giving its customers both object and file access to data ensures cloud storage is compatible with their users’ modern and legacy applications, which is a fairly unique feature compared to what is available from big cloud vendors.”


    July 8, 2019  7:27 PM

    Content Guru Creates A New Landmark at CCW Las Vegas @cgchirp

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Customer engagement, customer experience, Las Vegas, Rakuten

    Customer Contact Week (CCW) in Las Vegas this week became a major collaboration moment for two experts in their respective fields. One is Content Guru, a global frontrunner in large-volume cloud-based contact center technology; the other is Rakuten Inc., the Japanese electronic commerce company. The two global leaders joined forces at CCW to demonstrate how effective intelligent automation is when it comes to delivering exemplary customer engagement and experience. Of course, excelling in customer engagement is a matter of concern for any organization across the globe. As a matter of fact, this year marks the 20th anniversary of CCW. Customer Contact Week, held June 24 to June 28, 2019 at The Mirage Hotel in Las Vegas, is the world's most significant conference and expo for Customer Experience (CX), contact center, and customer care professionals.

    Content Guru

    Content Guru was at stand #1102 at the event, with a theme of showcasing major achievements for its customers in terms of customer experience. Rakuten is also known as the 'Amazon of Japan'. The story of the relationship between Content Guru and Rakuten is quite interesting: their journey together began at CCW Las Vegas in 2017. Rakuten highlighted how it successfully used Content Guru's cloud-based storm® platform to transform its customers' experience. After this deployment, Rakuten's customers have a much better experience when they contact the company or the thousands of sellers that use the Rakuten platform to sell their products. In fact, Content Guru hosted a workshop on Tuesday, June 25 titled "Next Generation Omni-Channel Contact Center: AI, NLP, Web Chat & Chatbots", led by Martin Taylor, Deputy CEO, Content Guru.

    Content Guru Creates A New Landmark in Customer Experience

    By conducting this workshop, Content Guru showcased how it utilizes intelligent automation and Artificial Intelligence through its state-of-the-art storm® platform to help its customers deliver high-quality, ultimate customer service. Martin Taylor says, “Content Guru’s partnership with Rakuten originally began from conversations at CCW Las Vegas, so this event will always have a special place in our hearts. The quality of customer service has become a crucial differentiator between businesses. The CCW conference allows us to put front-and-center Content Guru’s vision to place organizations head-and-shoulders above their competition by providing the best Customer Engagement and Experience.”

    You might like to see how storm works. Have a look at the insightful video below:


    July 2, 2019  4:01 PM

    Why You Should Replace or Enhance Your Legacy VPN with a Software-Defined Perimeter (SDP) Solution

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, SDP, VPN

    As has been the case for many decades, innovative new applications are entering the marketplace on a regular basis to support the ever-changing way we do business, interact with customers, and interact with each other. While developments in areas such as cloud, AI, machine learning, IoT, edge, mobile and big data, to name just a few, bring with them undeniable and highly desirable benefits, they can also introduce problems. Certainly, one of the biggest pain points for many organizations is how to ensure the protection and security of data. Failing to do so can mean not only serious detriment to the long-term success of your business but also serious legal and regulatory compliance ramifications.

    Adding to the problem is that many IT professionals have come to rely upon and trust virtual private networks (VPNs) to deliver the level of security they require. And, this makes sense. For a very long time, they did indeed deliver the required security protection. Unfortunately, as it stands today VPNs have not evolved to support today’s application protection and security requirements. At least, not by themselves.

    I recently spoke with Don Boxley, CEO, and Co-Founder of DH2i on this subject. He describes VPNs as taking a “castle and moat” approach to security, where the VPN serves as the drawbridge. This painted a very understandable picture as to why VPNs are unable to meet today’s new business and IT realities. He explained that via this approach, organizations are more vulnerable to compromised devices and networks, excessive network access by non-privileged users, credential theft and other security issues. From a non-security specific standpoint, the VPN introduces complex manual set-up and maintenance, slow and unreliable connections, and an inability to scale efficiently and cost-effectively.

    We then talked about a relatively new approach to not necessarily replace a VPN (although, I would argue it could), but to dramatically enhance it – a software-defined perimeter (SDP) solution. SDPs offer an ideal new approach for connectivity security. SDP tackles legacy VPN, cloud-native and privileged user access security issues. Designed specifically to support today’s DevOps, IoT, containers, edge, and other workloads, with the inherent flexibility to be tailored to support future yet to be introduced application/workload requirements, SDP not only delivers considerably improved security but increased performance speed as well.
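
    To make the contrast concrete, here is a conceptual Python sketch; the access policy and function names are invented for illustration and are not DH2i's product API:

        # Conceptual contrast only: a VPN grants network-level access, while an
        # SDP authorizes each (user, application) pair and builds an encrypted
        # tunnel for that pair alone.
        AUTHORIZED_PAIRS = {("alice", "payroll-db")}   # hypothetical access policy

        def sdp_connect(user, app):
            """Open an application-level tunnel only for an authorized pair."""
            if (user, app) not in AUTHORIZED_PAIRS:
                # Unauthorized pairs get no route at all, so the application is
                # invisible to them, unlike a VPN where a logged-in user can
                # often scan and reach the whole subnet.
                raise PermissionError(f"{user} has no route to {app}")
            return f"encrypted tunnel established: {user} -> {app} only"

        print(sdp_connect("alice", "payroll-db"))
        # sdp_connect("alice", "file-server") would raise PermissionError.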

    DH2i has announced a new SDP solution, called DxConnect. DxConnect is a network security software designed to enable developers and network admins to build an integrated zero trust (ZT) connectivity security infrastructure for cloud-native applications, hybrid/multi-cloud connectivity and privileged user access without using a VPN. If you are interested in securing your organization’s data, and you wish to replace or enhance the capabilities of your VPN, you can learn more about DH2i’s new software here: http://dh2i.com/dxconnect/.


    June 30, 2019  2:35 PM

    Data center virtualization A Complete Guide – 2019 Edition @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Center Virtualization

    Data center virtualization A Complete Guide – 2019 Edition by Gerardus Blokdyk

    Link: https://www.amazon.com/Data-center-virtualization-Complete-Guide-ebook/dp/B07T9QJSP7/ref=sr_1_22?keywords=virtualization&qid=1561884991&s=books&sr=1-22

    Excerpt from Amazon.com

    What is your risk of UPS failure? Should san and tape storage solutions be included in the asset inventory baseline? How does your organization further reduce operating costs at your data center? How to remote control, monitor and maintain the system? How much cooling infrastructure is necessary to cool the server environment adequately?

    Defining, designing, creating, and implementing a process to solve a challenge or meet an objective is the most valuable role… In EVERY group, company, organization and department.

    Unless you are talking a one-time, single-use project, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a complex enough perspective to ask the right questions. Someone capable of asking the right questions and step back and say, ‘What are we really trying to accomplish here? And is there a different way to look at it?’

    This Self-Assessment empowers people to do just that – whether their title is entrepreneur, manager, consultant, (Vice-)President, CxO etc… – they are the people who rule the future. They are the person who asks the right questions to make Data center virtualization investments work better.

    This Data center virtualization All-Inclusive Self-Assessment enables You to be that person.

    All the tools you need to an in-depth Data center virtualization Self-Assessment. Featuring 933 new and updated case-based questions, organized into seven core areas of process design, this Self-Assessment will help you identify areas in which Data center virtualization improvements can be made.

    In using the questions you will be better able to:

    – diagnose Data center virtualization projects, initiatives, organizations, businesses and processes using accepted diagnostic standards and practices

    – implement evidence-based best practice strategies aligned with overall goals

    – integrate recent advances in Data center virtualization and process design strategies into practice according to best practice guidelines

    Using a Self-Assessment tool known as the Data center virtualization Scorecard, you will develop a clear picture of which Data center virtualization areas need attention.


    June 30, 2019  2:30 PM

    Server Virtualization A Complete Guide – 2019 Edition @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Virtualization

    Server Virtualization A Complete Guide – 2019 Edition by Gerardus Blokdyk

    Link: https://www.amazon.com/Server-Virtualization-Complete-Guide-2019/dp/0655805613/ref=sr_1_18?keywords=virtualization&qid=1561884991&s=books&sr=1-18

    Excerpt from Amazon.com:

    What proportion of your data centers pursue energy savings through server virtualization? What proportion of data centers pursue energy savings through server virtualization? What about incompatibilities between two applications installed on the same instance of an operating system? What about mid-size companies? How will server virtualization technology evolve?

    Defining, designing, creating, and implementing a process to solve a challenge or meet an objective is the most valuable role… In EVERY group, company, organization and department.

    Unless you are talking a one-time, single-use project, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a complex enough perspective to ask the right questions. Someone capable of asking the right questions and step back and say, ‘What are we really trying to accomplish here? And is there a different way to look at it?’

    This Self-Assessment empowers people to do just that – whether their title is entrepreneur, manager, consultant, (Vice-)President, CxO etc… – they are the people who rule the future. They are the people who ask the right questions to make Server Virtualization investments work better.

    This Server Virtualization All-Inclusive Self-Assessment enables You to be that person.

    All the tools you need for an in-depth Server Virtualization Self-Assessment. Featuring 955 new and updated case-based questions, organized into seven core areas of process design, this Self-Assessment will help you identify areas in which Server Virtualization improvements can be made.

    In using the questions you will be better able to:

    – diagnose Server Virtualization projects, initiatives, organizations, businesses and processes using accepted diagnostic standards and practices

    – implement evidence-based best practice strategies aligned with overall goals

    – integrate recent advances in Server Virtualization and process design strategies into practice according to best practice guidelines

    Using a Self-Assessment tool known as the Server Virtualization Scorecard, you will develop a clear picture of which Server Virtualization areas need attention.
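
    For a feel of how such a scorecard might work, here is a hedged sketch in Python (the seven area names follow the RDMAICS convention these guides commonly use; the scores are invented, not from the book): average your answers per area, then rank the areas so the weakest surface first.

        # Hypothetical scorecard: one averaged 1-5 score per process-design area.
        AREAS = ["Recognize", "Define", "Measure", "Analyze", "Improve", "Control", "Sustain"]
        scores = {"Recognize": 4.2, "Define": 3.1, "Measure": 2.4, "Analyze": 3.8,
                  "Improve": 2.9, "Control": 4.0, "Sustain": 3.3}

        # Rank areas from weakest to strongest; the weakest need attention first.
        for area in sorted(AREAS, key=scores.get):
            flag = "<- needs attention" if scores[area] < 3.0 else ""
            print(f"{area:<10} {scores[area]:.1f} {flag}")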

    Your purchase includes access details for the Server Virtualization self-assessment dashboard download, which gives you a dynamically prioritized, project-ready tool and shows your organization exactly what to do next. You will receive the following contents with new and updated specific criteria:

    – The latest quick edition of the book in PDF

    – The latest complete edition of the book in PDF, whose criteria correspond to the criteria in…

    – The Self-Assessment Excel Dashboard

    – Example pre-filled Self-Assessment Excel Dashboard to get familiar with results generation

    – In-depth and specific Server Virtualization Checklists

    – Project management checklists and templates to assist with implementation

    INCLUDES LIFETIME SELF ASSESSMENT UPDATES

    Every self assessment comes with Lifetime Updates and Lifetime Free Updated Books. Lifetime Updates is an industry-first feature which allows you to receive verified self assessment updates, ensuring you always have the most accurate information at your fingertips.


    June 30, 2019  2:26 PM

    Global Software Engineering: Virtualization and Coordination @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Software engineering, Virtualization

    Global Software Engineering: Virtualization and Coordination (Applied Software Engineering Series) by Gamel O. Wiredu

    Link: https://www.amazon.com/Global-Software-Engineering-Virtualization-Coordination-ebook/dp/B07TMG3LYM/ref=sr_1_16?keywords=virtualization&qid=1561883567&s=books&sr=1-16

    Excerpt as on Amazon:

    Technology and organizations co-evolve, as is illustrated by the growth of information and communication technology (ICT) and global software engineering (GSE). Technology has enabled the development of innovations in GSE. The literature on GSE has emphasized the role of the organization at the expense of technology. This book explores the role of technology in the evolution of globally distributed software engineering.

    To date, the role of the organization has been examined in coordinating GSE activities because of the prevalence of the logic of rationality (i.e., the efficiency ethos, mechanical methods, and mathematical analysis) and indeterminacy (i.e., the effectiveness ethos, natural methods, and functional analysis). This logic neglects the coordination role of ICT. However, GSE itself is an organizational mode that is technology-begotten, technology-dominated, and technology-driven, as is its coordination. GSE is a direct reflection of ICT innovation, change, and use, yet research into the role of technology in GSE has been neglected.

    Global Software Engineering: Virtualization and Coordination considers existing fragmented explanations and perspectives in GSE research, poses new questions about GSE, and proposes a framework based on the logic of virtuality (i.e., creativity ethos, electrical methods, and technological analysis) rather than of rationality and indeterminacy. Virtuality is the primary perspective in this book’s comprehensive study of GSE. The book concludes with an integrated explanation of GSE coordination made possible through ICT connectivity and capitalization.


    June 30, 2019  2:23 PM

    Docker in Action 2nd Edition @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Docker

    Docker in Action 2nd Edition by Jeff Nickoloff and Stephen Kuenzli

    Link: https://www.amazon.com/Docker-Action-Jeff-Nickoloff/dp/1617294764/ref=sr_1_10?keywords=virtualization&qid=1561883567&s=books&sr=1-10

    Excerpt as on Amazon:

    Even small applications have dozens of components. Large applications may have thousands, which makes them challenging to install, maintain, and remove. Docker bundles all application components into a package called a container that keeps things tidy and helps manage any dependencies on other applications or infrastructure.
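
    To make the container model concrete, here is a minimal sketch (my illustration, not from the book) using the community Docker SDK for Python (pip install docker). It assumes a local Docker daemon is running; the image name and command are placeholders.

        import docker

        client = docker.from_env()  # connect to the local Docker daemon

        # Run a one-off command in an isolated container; remove=True cleans
        # the container up once the command finishes.
        output = client.containers.run("alpine:3.19", "echo hello from a container", remove=True)
        print(output.decode().strip())

        # List the containers currently running on this host.
        for c in client.containers.list():
            print(c.short_id, c.image.tags, c.status)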

    Docker in Action, Second Edition teaches you the skills and knowledge you need to create, deploy, and manage applications hosted in Docker containers. This bestseller has been fully updated with new examples, best practices, and entirely new chapters. You’ll start with a clear explanation of the Docker model and learn how to package applications in containers, including techniques for testing and distributing applications.

    Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.


    June 30, 2019  2:20 PM

    5G Physical Layer Technologies (Wiley – IEEE) @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    5G

    5G Physical Layer Technologies (Wiley – IEEE) by Mosa Ali Abu-Rgheff

    Link: https://www.amazon.com/5G-Physical-Layer-Technologies-Wiley/dp/1119525519/ref=sr_1_8?keywords=virtualization&qid=1561883567&s=books&sr=1-8

    Excerpt as on Amazon:

    Written in a clear and concise manner, this book presents readers with an in-depth discussion of the 5G technologies that will help move society beyond its current capabilities. It perfectly illustrates how the technology itself will benefit both individual consumers and industry as the world heads towards a more connected state of being. Every technological application presented is modeled in a schematic diagram and is considered in depth through mathematical analysis and performance assessment. Furthermore, published simulation data and measurements are checked.

    Each chapter of 5G Physical Layer Technologies contains texts, mathematical analysis, and applications supported by figures, graphs, data tables, appendices, and a list of up to date references, along with an executive summary of the key issues. Topics covered include: the evolution of wireless communications; full duplex communications and full dimension MIMO technologies; network virtualization and wireless energy harvesting; Internet of Things and smart cities; and millimeter wave massive MIMO technology. Additional chapters look at millimeter wave propagation losses caused by atmospheric gases, rain, snow, building materials and vegetation; wireless channel modeling and array mutual coupling; massive array configurations and 3D channel modeling; massive MIMO channel estimation schemes and channel reciprocity; 3D beamforming technologies; and linear precoding strategies for multiuser massive MIMO systems. Other features include:

    In-depth coverage of a hot topic soon to become the backbone of IoT, connecting devices, machines, and vehicles
    Addresses the need for green communications for the 21st century
    Provides comprehensive support for the advanced mathematics exploited in the book by including appendices and worked examples
    Contributions from EU research programmes, international telecommunications companies, and international standards institutions (ITU, 3GPP, ETSI) are covered in depth
    Includes numerous tables and illustrations to aid the reader
    Fills the gap in the current literature where technologies are not explained in depth or are omitted altogether

    5G Physical Layer Technologies is an essential resource for undergraduate and postgraduate courses on wireless communications and technology. It is also an excellent source of information for design engineers, research and development engineers, the private-public research community, university research academics, undergraduate and postgraduate students, technical managers, service providers, and all professionals involved in the communications and technology industry.


    June 30, 2019  2:17 PM

    Microsoft Azure Infrastructure Services for Architects @Amazon

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Azure

    Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions by John Savill

    Link: https://www.amazon.com/Microsoft-Azure-Infrastructure-Services-Architects/dp/1119596572/ref=sr_1_6?keywords=virtualization&qid=1561883567&s=books&sr=1-6

    Excerpt as on Amazon.com:

    An expert guide for IT administrators needing to create and manage a public cloud and virtual network using Microsoft Azure

    With Microsoft Azure challenging Amazon Web Services (AWS) for market share, there has been no better time for IT professionals to broaden and expand their knowledge of Microsoft’s flagship virtualization and cloud computing service. Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions helps readers develop the skills required to understand the capabilities of Microsoft Azure for Infrastructure Services and implement a public cloud to achieve full virtualization of data, both on and off premises. Microsoft Azure provides granular control in choosing core infrastructure components, enabling IT administrators to deploy new Windows Server and Linux virtual machines, adjust usage as requirements change, and scale to meet the infrastructure needs of their entire organization.
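
    As a hedged illustration of that granular control (my sketch, not from the book), the Azure SDK for Python can enumerate the virtual machines in a subscription. It assumes pip install azure-identity azure-mgmt-compute, an already-authenticated environment, and a real subscription ID in place of the placeholder.

        from azure.identity import DefaultAzureCredential
        from azure.mgmt.compute import ComputeManagementClient

        subscription_id = "<your-subscription-id>"  # placeholder
        compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

        # Inventory every VM in the subscription with its location and size,
        # the kind of overview an architect needs before scaling decisions.
        for vm in compute.virtual_machines.list_all():
            print(vm.name, vm.location, vm.hardware_profile.vm_size)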

    This accurate, authoritative book covers topics including IaaS cost and options, customizing VM storage, enabling external connectivity to Azure virtual machines, extending Azure Active Directory, replicating and backing up to Azure, disaster recovery, and much more. New users and experienced professionals alike will:

    Get expert guidance on understanding, evaluating, deploying, and maintaining Microsoft Azure environments from Microsoft MVP and technical specialist John Savill
    Develop the skills to set up cloud-based virtual machines, deploy web servers, configure hosted data stores, and use other key Azure technologies
    Understand how to design and implement serverless and hybrid solutions
    Learn to use enterprise security guidelines for Azure deployment

    Offering the most up-to-date information and practical advice, Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions is an essential resource for IT administrators, consultants, and engineers responsible for learning, designing, implementing, managing, and maintaining Microsoft virtualization and cloud technologies.

    The book is yet to be released. You can pre-order it.


    June 30, 2019  2:13 PM

    SQL Server Big Data Clusters Revealed: A Book @Amazon to Pre-Order

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data

    Title: SQL Server Big Data Clusters Revealed: The Data Virtualization, Data Lake, and AI Platform by Benjamin Weissman and Enrico van de Laar

    Link: https://www.amazon.com/Server-Data-Clusters-Revealed-Virtualization/dp/1484251091/ref=sr_1_3?keywords=virtualization&qid=1561883567&s=books&sr=1-3

    Editorial Review: As on Amazon.com
    From the Back Cover

    Use this guide to one of SQL Server 2019’s latest and most impactful features―Big Data Clusters―that combines large volumes of non-relational data for analysis along with data stored relationally inside a SQL Server database.

    Big Data Clusters is a feature set covering data virtualization, distributed computing, and relational databases and provides a complete AI platform across the entire cluster environment. This book shows you how to deploy, manage, and use Big Data Clusters. For example, you will learn how to combine data stored on the HDFS file system together with data stored inside the SQL Server instances that make up the Big Data Cluster.

    Filled with clear examples and use cases, SQL Server Big Data Clusters Revealed provides everything necessary to get started working with SQL Server 2019 Big Data Clusters. You will learn about the architectural foundations that are made up from Kubernetes, Spark, HDFS, and SQL Server on Linux. You then are shown how to configure and deploy Big Data Clusters in on-premises environments or in the cloud. Next, you are taught about querying. You will learn to write queries in Transact-SQL―taking advantage of skills you have honed for years―and with those queries you will be able to examine and analyze data from a wide variety of sources such as Apache Spark.
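
    As a hedged sketch of what that looks like from Python (my example, not from the book; the server address, credentials, and the HDFS-backed external table name are hypothetical), pyodbc can send Transact-SQL to the cluster’s SQL Server master instance:

        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=<master-instance>,31433;DATABASE=sales;"
            "UID=<user>;PWD=<password>"
        )

        # Plain Transact-SQL: the external table is backed by files on HDFS,
        # yet it joins with relational tables like any other SQL Server table.
        sql = """
        SELECT TOP (10) c.customer_name, COUNT(*) AS clicks
        FROM dbo.customers AS c
        JOIN dbo.web_clickstreams_hdfs AS w ON w.customer_id = c.customer_id
        GROUP BY c.customer_name
        ORDER BY clicks DESC;
        """
        for row in conn.cursor().execute(sql):
            print(row.customer_name, row.clicks)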

    Through the theoretical foundation provided in this book and easy-to-follow example scripts and notebooks, you will be ready to use and unveil the full potential of SQL Server 2019: combining different types of data spread across widely disparate sources into a single view that is useful for business intelligence and machine learning analysis.

    You will:

    Install, manage, and troubleshoot Big Data Clusters in cloud or on-premises environments
    Analyze large volumes of data directly from SQL Server and/or Apache Spark
    Manage data stored in HDFS from SQL Server as if it were relational data
    Implement advanced analytics solutions through machine learning and AI
    Expose different data sources as a single logical source using data virtualization


    June 30, 2019  1:57 PM

    Virtual Appliance Performance Achieves A New Landmark @StorOne_Inc

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Seagate, SSD, Virtual Appliance

    A video link given at the end of this article will help you understand things better. Do have a look after you finish the article. The video is about StorONE Virtual Appliance performance. I am covering that here, along with StorONE’s recent partnership with Seagate, which aims to maximize SSD performance specs. Basically, a unified virtual appliance with Seagate drives is ideal for I/O-intensive environments. On 25th June, StorONE announced record speeds using Seagate SSDs and StorONE’s TRU™ S1 Software-Defined Storage solution. In this recent performance testing, StorONE combined Seagate’s enterprise-class SSDs with its software in a virtual appliance configuration. The combination attained a breakthrough half a million IOPS with 24 Seagate SSDs, with all enterprise-class data protection features running.
    For any such environment, what matters most is seamless throughput. In this particular case, the high-availability, failure-proof VMware cluster attained that phenomenal half-million IOPS on random reads (4K) and 180,000 IOPS on random writes (4K). The latency in both cases was less than 0.2 milliseconds. This is a significant achievement, as it eliminates the storage performance issues arising from server virtualization. That makes the StorONE Unified Virtual Appliance configuration with Seagate drives a unique and most appropriate proposition. In this case, StorONE software runs in an ESXi VM that doesn’t require any complicated setup or configuration. It’s a simple environment that uses very little memory and compute. In fact, it doesn’t require any high-end server configurations, huge memory caches, or other expensive hardware components. The virtual appliance supports all major hypervisor environments, including VMware, Oracle VM, Hyper-V, and KVM.
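
    As a back-of-envelope check (the IOPS figures are from the announcement; the conversion is mine), those 4K random I/O numbers translate into raw bandwidth as follows:

        # Convert 4K random IOPS into approximate bandwidth.
        BLOCK = 4 * 1024  # 4 KiB per I/O

        for label, iops in [("random read", 500_000), ("random write", 180_000)]:
            gib_per_s = iops * BLOCK / 2**30
            print(f"{label}: {iops:,} IOPS x 4 KiB ~= {gib_per_s:.2f} GiB/s")

        # random read: 500,000 IOPS x 4 KiB ~= 1.91 GiB/s
        # random write: 180,000 IOPS x 4 KiB ~= 0.69 GiB/s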

    Virtual Appliance Performance

    Ravi Naik, CIO and SVP Corporate Strategy at Seagate, says, “StorONE’s software elicits the extreme performance of our SSDs in meeting the needs of mission-critical applications. With StorONE’s software, users can achieve the maximum utilization of resources from our solid-state drives, getting the best possible TCO via optimized storage capacity and performance.”

    Gal Naor, StorONE co-founder and CEO, says, “Seagate drives offer superior drive design and reliability for enterprise use, and our tests show they offer performance beyond their competitors. We are very proud to have the world’s leading data solutions company as an early investor and partner, on whose drive technology we can reach incredible throughput and deliver all essential data services for managing and protecting data.”

    For a performance demo of the Unified Virtual Appliance with Seagate SSDs, visit https://youtu.be/KMDzOzEr79o


    June 16, 2019  11:24 PM

    SwiftStack Data Analytics Solution With Alluxio @SwiftStack @alluxio

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Analytics, Data storage, SwiftStack

    Do you know what the common factor is among Tata Communications, eBay, Cisco, Technical Assistance Center, Kaidee, Counsyl, Surf Sara, DC Blox, Hepsiburada, DRN, OMRF, Bet365, ESU10, Tieto, TCT, Premiere Digital, Enter, NSS Labs, PayPal, yp.ca, Douglas, Verizon, Pac12, and Burton? All of them are SwiftStack customers, and they have been happy and delighted customers for years. We are talking about one of the best data analytics solutions across the globe.

    Data Analytics Solution

    Source: SwiftStack

    Now, let us look at another set of global-class companies: Wells Fargo, Lenovo, PayPal, DBS, Walmart, Huatai Securities, Myntra, VIPShop, Two Sigma, Oracle, Comcast, JD.com, Samsung, NetEase Games, DiDi, Tencent, ESRI, Ctrip, Nielsen, Caesars, Barclays, Baidu, Swisscom, and China Mobile. All of these are Alluxio customers. So when SwiftStack and Alluxio come out together with a solution, it has to be a state-of-the-art, unique one. And what are the CTOs and CIOs of companies that are not customers of SwiftStack or Alluxio supposed to do? They must immediately understand what it is all about, and what a world-class data analytics solution is all about.

    Data Analytics Solution

    Source: SwiftStack

    Following that, three steps are very important for them: participate in a live demo, have a technical deep-dive discussion with SwiftStack and Alluxio, and download the product and try it, because without tasting the pudding you will never be able to understand its power and beauty. This Alluxio and SwiftStack partnership brings the SwiftStack Data Analytics Solution with Alluxio, seamless from edge to core to cloud. The current enterprise situation across the globe is quite alarming. Most businesses are in a fix; their dilemma is between on-premises and cloud.

    SwiftStack Data Analytics Solution with Alluxio

    The data is lying in silos and in huge volumes. Existing solutions seem to fall short on promises and commitments. Most of the products in the market lack enterprise readiness. On top of that, in the cloud, OpEx is increasing at a faster pace. The pressures are mounting. Are you safe with your current design and solutions?

    Data Analytics Solution

    Source: SwiftStack

    None of the existing data analytics vendors seem to have an iota of confidence in their products when it comes to catering to four rapidly changing trends: the separation of compute and storage, hybrid multi-cloud environments, the rise of the object store, and self-service data across the enterprise. The data ecosystem will not be the same next year. Look at the data ecosystem’s beta version, where compute was solely dependent on Hadoop MapReduce and storage on Hadoop HDFS.

    Data Analytics Solution

    Source: SwiftStack

    The next maturity model was Data Ecosystem 1.0, where compute had many players like Presto, Spark, Flink, Caffe, Apache HBase, and TensorFlow, while Hadoop MapReduce stayed a strong contender as before. On the storage front, a lot of players emerged, like Amazon S3, Azure, Ceph, HPE, IBM, Dell EMC ECS, Hitachi, and MinIO, in addition to the legendary Hadoop HDFS. Things are changing rapidly.
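
    Part of what makes the object store central to this story is that one API works everywhere. As a hedged sketch (the endpoint URL, credentials, and bucket name are invented; SwiftStack exposes an S3-compatible API), the same boto3 call that targets AWS S3 can point at an on-premises endpoint:

        import boto3

        s3 = boto3.client(
            "s3",
            endpoint_url="https://swiftstack.example.com",  # on-premises endpoint
            aws_access_key_id="<access-key>",
            aws_secret_access_key="<secret-key>",
        )

        # The identical call works unchanged against AWS S3, which is what
        # makes object storage a natural substrate for hybrid analytics.
        for obj in s3.list_objects_v2(Bucket="analytics-data").get("Contents", []):
            print(obj["Key"], obj["Size"])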

    Data Analytics Solution

    Source: SwiftStack

    The SwiftStack Data Analytics Solution with Alluxio ensures multi-cloud storage and data management through accelerated compute, data accessibility, and elasticity. You name a business use case, and there is a seamless solution available. Amita Potnis, Research Director at IDC’s Infrastructure System Platform and Technologies Group, says, “Infrastructure challenges are the primary inhibitor for broader adoption of AI/ML workflows. SwiftStack’s Multi-Cloud data management solution is first of its kind in the industry and effectively handles storage I/O challenges faced by the edge to core to cloud, large scale AI/ML data pipelines.”


    June 15, 2019  9:45 PM

    Do You Get A Comprehensive View of IT Assets and Activities @Cynet360

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cyber security, cyber-attacks, cyber-crimes, IT asset management

    I am sure keeping track of IT assets in the organization is a big pain for all IT heads. There are assets that are in use. There are assets that serve as standbys. And then there are assets that are old, obsolete, or out of use, lying idle in some storeroom of the organization. Losing a 1TB laptop with no data in it is not a major issue. But losing even a few MB of data that is crucial and confidential can create a big issue that might lead to a loss of reputation and finances. All this makes a comprehensive view of IT assets, their activities, and their movements more important. There is nothing like it if the solution comes from one of the top leaders in the industry, and that too free of cost. Cynet recently launched a free proactive visibility solution.

    IT Assets

    This Proactive Visibility offering from Cynet provides a comprehensive, real-time view of your organization’s IT assets, with all checks and alerts built in. The tool is equally beneficial for service providers and enterprises. Whatever you can think of is there in it, with detailed inventory reporting and attack surface elimination. This is an on-demand solution empowering an organization to enhance security and productivity tasks on an ongoing basis. In fact, IT and security decision makers find it an effective catalyst for the tools they use, giving comprehensive visibility to boost critical IT operations and the productivity of both end users and service providers. The rate, intensity, and severity of cybercrime across the globe are increasing at a tremendous pace. Over the last four years, the cost of cybercrime has quadrupled, which is alarming.

    IT Assets

    A recent research report from Juniper titled The Future of Cybercrime & Security: Financial and Corporate Threats & Mitigations estimates the total cost of cybercrime to exceed $2 trillion this year. The larger the attack surface in an organization, the larger the number of entry points or attack vectors, and hence the larger the scale of threats or vulnerabilities arising out of it. A professional hacker or cybercriminal needs just one of these points to penetrate your organization’s databases. Recent research shows around 60 percent of organizations across the globe have over 100,000 folders open to every employee. Cynet’s Proactive Visibility empowers security administrators to enhance the efficiency of security monitoring workflows.

    Eyal Gruner, Founder & President of Cynet says,

    “There is a critical need for a single-source-of-truth where users get a complete visual of both positive and negative actions/processing taking place across their centralized or distributed IT infrastructure. Our free Proactive Visibility Experience delivers the operational reality of having all of this data available with the click of a button, allowing for accurate data-driven decision making.”

    IT Assets

    Key components of the Cynet 360 Security Platform include Cynet Proactive Visibility, Attack Protection, and Response Orchestration. It integrates various globally accepted technologies like NGAV, network analytics, EDR, UBA, and deception. As a matter of fact, Cynet is the first vendor to integrate all the essential breach protection capabilities. These consolidated capabilities are then applied to the complete internal environment in a single interface.

    “It’s a rather worn-out phrase that you can’t secure what you don’t know, but it’s true all the same,” Gruner concludes. “We’re really able to boost organizations in the right direction with our highly available, high-resolution knowledge of the user’s environment. Use of the Proactive Visibility offering is the equivalent of a good opening move in chess because it narrows down the risks that users face and enables the enterprise to focus on what really matters.”

    Whether you are an end-user organization or a service provider, to gain 14-day access to the Cynet 360 platform, which includes full visibility into your IT environment, IT assets, host configurations, user account activities, installed software, network traffic, and password hygiene, click on the link: free access to its end-to-end visibility capabilities.


    June 10, 2019  8:53 PM

    Deskless Workforce Management Becomes Easier with StaffConnect

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    mobile workforce, Remote worker

    The biggest challenge for enterprises across the globe is to close the gap between important organizational information and their mobile employees, or deskless workforce. This challenge became a thing of the past for the organizations that adopted StaffConnect well in time. Those who have already tasted the pudding are reaping the fruits. For those who haven’t deployed StaffConnect so far, it is a golden chance to go for the StaffConnect next-generation mobile employee engagement platform. Why is it necessary? Well, first and foremost, organizations with remote workers are struggling to tackle an employee engagement crisis. The second important point is StaffConnect’s record of inspiring emotionally connected enterprises. Otherwise, it is a pain to bridge the emotional disconnect between employees and the organization. The launch of StaffConnect v2.4 brings enhanced enterprise features with powerful analytics.

    Besides enhanced enterprise features and powerful analytics, StaffConnect v2.4 also comes with advanced functionality for segmenting users and delivering targeted content and messages to an organization’s mobile workforce irrespective of location. StaffConnect is a pioneer of mobile employee engagement solutions for the deskless workforce. StaffConnect v2.4 has become more powerful with features such as 5,000 unique groups assignable to 100 unique categories. Administrators can now easily and efficiently create targeted content that is not only relevant and meaningful but can also be shared quickly and directly with targeted audiences across an organization. Those targeted audiences can be segmented by employee, department, role, building, and so on, as the sketch below illustrates. What this means is that with StaffConnect v2.4, each feed can be made as complex or as simple as the organization requires. Each feed is easily customizable and configurable by administrators without any assistance.
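
    As a hedged sketch of what such segmentation amounts to (the data model below is invented for illustration; StaffConnect’s actual implementation is not described in this post), a feed simply targets the subset of staff matching some attributes:

        from dataclasses import dataclass

        @dataclass
        class Employee:
            name: str
            department: str
            role: str
            building: str

        staff = [
            Employee("Asha", "Nursing", "Nurse", "Block A"),
            Employee("Tom", "Facilities", "Technician", "Block B"),
            Employee("Meera", "Nursing", "Ward Manager", "Block A"),
        ]

        # A targeted feed: everyone in Nursing based in Block A.
        segment = [e for e in staff if e.department == "Nursing" and e.building == "Block A"]
        for e in segment:
            print(f"push update to {e.name} ({e.role})")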

    Deskless workforce

    “How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees” is a very interesting eBook released by StaffConnect. It comes with astonishing facts, such as that 80 percent of the global workforce is deployed offsite or remotely. The eBook provides ample statistics and insights from various angles, highlighting the pain areas and financial losses organizations suffer when their deskless workforce is not well connected with the organization. Betsi Cadwaladr University Health Board (BCUHB) is one of the key customers of StaffConnect.

    Aaron Haley, Communications Officer at BCUHB says, “While email works for our desk-based staff, there’s a big contingency of our workforce who just can’t find the time to get on to a computer as part of their working day. We wanted the new internal communications tool to be completely voluntary, and we wanted to demonstrate our commitment to improving internal communications with a platform that meets the needs of all our employees, regardless of their role or location. StaffConnect ticked all of those boxes.”

    Deskless Workforce

    Geraldine Osman, CMO, StaffConnect says, “StaffConnect v2.4 offers a powerful combination of advanced analytics and sophisticated audience segmentation capabilities in order to enable our large global enterprise customers to create, tailor and deliver highly targeted content to specific audiences. StaffConnect v2.4’s audience and content analytics then work across the platform and across the organization to track and measure every aspect of the communications program, enabling HR and Communications professionals to fine-tune strategy and ensure organizational goals are met.”

    Osman continues, “With StaffConnect v2.4 HR and communications can now deliver an enhanced employee experience (EX) by delivering relevant and personalized information which creates deeper engagement. Highly engaged employees are more committed to company goals, productive and dedicated to ensuring optimal customer experiences (CX). Organizations that consistently deliver superior CX are proven to earn and enjoy increased revenues, profits and shareholder value.”


    June 9, 2019  9:23 PM

    How To Push Boundaries of Enterprise SSDs With @ViolinSystems

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Enterprise storage, Flash Array, NVM, SSD, Violin Systems

    When you think of extreme-performance storage or enterprise SSDs, you think of Violin Systems. Otherwise, why would two pioneers in their respective fields decide to form a strategic development alliance for the purpose of jointly advancing performance-based SSDs with the integration of Violin’s advanced technologies? The two masters entering into a strategic partnership to integrate Phison’s E12DC controller with Violin’s state-of-the-art technology and extreme-performance storage systems are Violin Systems and Phison Electronics Corporation. Violin Systems is a global leader in all-flash arrays. The company has been leading the market in delivering consistent extreme performance through its unique Flash Fabric Architecture and vRAID, simultaneously ensuring continuity of operations aided by its advanced enterprise data services. Violin is continuously evolving from custom to customized SSDs without compromising performance or resiliency. Industry-standard NVMe solutions are compelled to catch up to the level of Violin’s proprietary technology.

    Phison is among the top leaders in NAND flash controllers and applications. It ships more than 600 million controllers annually, which itself demonstrates its capabilities and its top position in enterprise-class technology. While Violin is one of Phison’s customers, it is also a joint technology partner, with the two companies working collaboratively on performance optimization. By enhancing the standard NVMe interface with its proprietary performance-enhancing technologies, Violin’s performance storage solution, along with its advanced enterprise software, has reached an unmatched level. While promising to provide consistent extreme performance for the industry’s top requirements like enterprise SSDs, OLTP, real-time analytics, virtualization applications, SQL databases, and so on, it goes a level up in providing support for AI, machine learning, and IIoT applications. All this comes with zero tolerance for any performance lag.

    Enterprise SSDs achieve a new paradigm

    K.S. Pua, CEO, Phison says, “We aim to be our customers’ most dependable IT business partner. Phison is constantly striving to find new ways to integrate capabilities into our product sets that add true value for our customers. The Phison E12DC enterprise controller builds on the award-winning success of the E12 solution by adding robust algorithms to ensure predictable latency and consistent I/O. With the E12 already strongly established in the consumer space, we are able to accelerate our growth in the enterprise-embedded market. We believe that partnering with Violin and leveraging each other’s technology and expertise allows us to make a big impact on the market.”

    Eric Burgener, research vice president, Infrastructure Systems, Platforms and Technologies Group, IDC says, “The tight collaboration of extreme-performance storage systems expert Violin with resources of an industry leader in flash controllers and NAND solutions like Phison is a great move that can enable rapid innovation working with standards-based interfaces.”

    Mark Lewis, Chairman and CEO of Violin Systems, says, “While NVMe is becoming a growing standard in performance storage, we have been able to further enhance the advancements made with the technology by leveraging our patented technology to provide enterprises extreme, consistent performances. This joint development with Phison allows both companies to push the boundaries of enterprise SSDs and bring true innovation to the market through standards-based technology vs. proprietary technology.”


    May 29, 2019  10:23 PM

    5 Books on Agile Testing @Amazon #AgileTesting

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Agile Software, Agile testing

    Book 1:

    Agile Testing: A Practical Guide for Testers and Agile Teams by Lisa Crispin and Janet Gregory

    Synopsis from Amazon:

    Two of the industry’s most experienced agile testing practitioners and consultants, Lisa Crispin and Janet Gregory, have teamed up to bring you the definitive answers to these questions and many others. In Agile Testing, Crispin and Gregory define agile testing and illustrate the tester’s role with examples from real agile teams. They teach you how to use the agile testing quadrants to identify what testing is needed, who should do it, and what tools might help. The book chronicles an agile software development iteration from the viewpoint of a tester and explains the seven key success factors of agile testing.

    Readers will come away from this book understanding

    How to get testers engaged in agile development
    Where testers and QA managers fit on an agile team
    What to look for when hiring an agile tester
    How to transition from a traditional cycle to agile development
    How to complete testing activities in short iterations
    How to use tests to successfully guide development
    How to overcome barriers to test automation

    This book is a must for agile testers, agile teams, their managers, and their customers.

    The eBook edition of Agile Testing also is available as part of a two-eBook collection, The Agile Testing Collection (9780134190624).

    Book 2: More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley Signature Series (Cohn)) by Janet Gregory and Lisa Crispin

    Synopsis from Amazon:

    Janet Gregory and Lisa Crispin pioneered the agile testing discipline with their previous work, Agile Testing. Now, in More Agile Testing, they reflect on all they’ve learned since. They address crucial emerging issues, share evolved agile practices, and cover key issues agile testers have asked to learn more about.

    Packed with new examples from real teams, this insightful guide offers detailed information about adapting agile testing for your environment; learning from experience and continually improving your test processes; scaling agile testing across teams; and overcoming the pitfalls of automated testing. You’ll find brand-new coverage of agile testing for the enterprise, distributed teams, mobile/embedded systems, regulated environments, data warehouse/BI systems, and DevOps practices.

    You’ll come away understanding

    • How to clarify testing activities within the team

    • Ways to collaborate with business experts to identify valuable features and deliver the right capabilities

    • How to design automated tests for superior reliability and easier maintenance

    • How agile team members can improve and expand their testing skills

    • How to plan “just enough,” balancing small increments with larger feature sets and the entire system

    • How to use testing to identify and mitigate risks associated with your current agile processes and to prevent defects

    • How to address challenges within your product or organizational context

    • How to perform exploratory testing using “personas” and “tours”

    • Exploratory testing approaches that engage the whole team, using test charters with session- and thread-based techniques

    • How to bring new agile testers up to speed quickly–without overwhelming them

    The eBook edition of More Agile Testing also is available as part of a two-eBook collection, The Agile Testing Collection (9780134190624).


    May 29, 2019  10:16 PM

    2 Books on DevOps @Amazon #DevOps #DevOpsBooks

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DevOps

    Book 1: The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations by Gene Kim, Jez Humble, et al.

    Synopsis from Amazon:

    Increase profitability, elevate work culture and exceed productivity goals through DevOps practices.

    More than ever, effective management of technology is critical for business competitiveness. For decades, technology leaders have struggled to balance agility, reliability, and security. The consequences of failure have never been greater―whether it’s the healthcare.gov debacle, cardholder data breaches, or missing the boat with Big Data in the cloud.

    And yet, high performers using DevOps principles, such as Google, Amazon, Facebook, Etsy, and Netflix, are routinely and reliably deploying code into production hundreds, or even thousands, of times per day.

    Following in the footsteps of The Phoenix Project, The DevOps Handbook shows leaders how to replicate these incredible outcomes, by showing how to integrate Product Management, Development, QA, IT Operations, and Information Security to elevate your company and win in the marketplace.

    Book 2: What is DevOps? by Mike Loukides

    Synopsis from Amazon:

    Have we entered the age of NoOps infrastructures? Hardly. Old-style system administrators may be disappearing in the face of automation and cloud computing, but operations have become more significant than ever. As this O’Reilly Radar Report explains, we’re moving into a more complex arrangement known as “DevOps.”

    Mike Loukides, O’Reilly’s VP of Content Strategy, provides an incisive look into this new world of operations, where IT specialists are becoming part of the development team. In an environment with thousands of servers, these specialists now write the code that maintains the infrastructure. Even applications that run in the cloud have to be resilient and fault tolerant, need to be monitored, and must adjust to huge swings in load. That was underscored by Amazon’s EBS outage last year.

    From the discussions at O’Reilly’s Velocity Conference, it’s evident that many operations specialists are quickly adapting to the DevOps reality. But as a whole, the industry has just scratched the surface. This report tells you why.


    May 29, 2019  10:10 PM

    A Book on Test Automation @Amazon #TestAutomation

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Test Automation

    Book 1: Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation (Addison-Wesley Signature Series (Fowler)) by Jez Humble and David Farley

    Synopsis from Amazon:

    Winner of the 2011 Jolt Excellence Award!

    Getting software released to users is often a painful, risky, and time-consuming process. This groundbreaking new book sets out the principles and technical practices that enable rapid, incremental delivery of high-quality, valuable new functionality to users. Through automation of the build, deployment, and testing process, and improved collaboration between developers, testers, and operations, delivery teams can get changes released in a matter of hours, sometimes even minutes, no matter what the size of a project or the complexity of its code base.

    Jez Humble and David Farley begin by presenting the foundations of a rapid, reliable, low-risk delivery process. Next, they introduce the “deployment pipeline,” an automated process for managing all changes, from check-in to release. Finally, they discuss the “ecosystem” needed to support continuous delivery, from infrastructure, data and configuration management to governance.

    The authors introduce state-of-the-art techniques, including automated infrastructure management and data migration, and the use of virtualization. For each, they review key issues, identify best practices, and demonstrate how to mitigate risks. Coverage includes

    • Automating all facets of building, integrating, testing, and deploying software

    • Implementing deployment pipelines at team and organizational levels

    • Improving collaboration between developers, testers, and operations

    • Developing features incrementally on large and distributed teams

    • Implementing an effective configuration management strategy

    • Automating acceptance testing, from analysis to implementation

    • Testing capacity and other non-functional requirements

    • Implementing continuous deployment and zero-downtime releases

    • Managing infrastructure, data, components and dependencies

    • Navigating risk management, compliance, and auditing

    Whether you’re a developer, systems administrator, tester, or manager, this book will help your organization move from idea to release faster than ever, so you can deliver value to your business rapidly and reliably.
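
    The “deployment pipeline” described above is easy to picture in code. Here is a toy sketch of my own (not the authors’ code; the stage names are invented): every change passes through the same ordered, automated stages, and a failure at any stage stops the release.

        def commit_stage() -> bool:       # compile + unit tests
            return True

        def acceptance_stage() -> bool:   # automated acceptance tests
            return True

        def production_deploy() -> bool:  # push-button release
            print("released to users")
            return True

        def run_pipeline() -> None:
            # Ordered stages; any failure halts the release.
            for stage in (commit_stage, acceptance_stage, production_deploy):
                if not stage():
                    print(f"stopped at {stage.__name__}; change not released")
                    return

        run_pipeline()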


    May 29, 2019  9:58 PM

    2 Books on Software Testing @Amazon #SoftwareTesting

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Software testing

    Book 1: Software Testing by Ron Patton

    Synopsis from Amazon:

    Software Testing, Second Edition provides practical insight into the world of software testing and quality assurance. Learn how to find problems in any computer program, how to plan an effective test approach, and how to tell when the software is ready for release. Updated from the previous edition in 2000 to include a chapter that specifically deals with testing software for security bugs, the processes and techniques used throughout the book are timeless. This book is an excellent investment if you want to better understand what your software test team does, or if you want to write better software.

    Book 2: Lessons Learned in Software Testing: A Context-Driven Approach by Cem Kaner, James Bach, et al.

    Synopsis from Amazon:

    Decades of software testing experience condensed into the most important lessons learned.

    The world’s leading software testing experts lend you their wisdom and years of experience to help you avoid the most common mistakes in testing software. Each lesson is an assertion related to software testing, followed by an explanation or example that shows you the how, when, and why of the testing lesson. More than just tips, tricks, and pitfalls to avoid, Lessons Learned in Software Testing speeds you through the critical testing phase of the software development project without the extensive trial and error it normally takes to do so. The ultimate resource for software testers and developers at every level of expertise, this guidebook features:
    * Over 200 lessons gleaned from over 30 years of combined testing experience
    * Tips, tricks, and common pitfalls to avoid by simply reading the book rather than finding out the hard way
    * Lessons for all key topic areas, including test design, test management, testing strategies, and bug reporting
    * Explanations and examples of each testing trouble spot help illustrate each lesson’s assertion


    May 29, 2019  9:49 PM

    5 Books for Project Managers @Amazon #ProjectManager

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Project Manager

    Book 1: A Project Manager’s Book of Forms: A Companion to the PMBOK Guide by Cynthia Snyder Dionisio

    Book 2: The New One-Page Project Manager: Communicate and Manage Any Project With A Single Sheet of Paper by Clark A. Campbell and Mick Campbell

    Synopsis on Amazon:

    How to manage any project on just one piece of paper

    The New One-Page Project Manager demonstrates how to efficiently and effectively communicate essential elements of a project’s status. The hands of a pocket watch reveal the time of day without following every spring, cog, and movement behind the face. Similarly, an OPPM template reduces any project—no matter how large or complicated—to a simple one-page document, perfect for communicating to upper management and other project stakeholders. Now in its Second Edition, this practical guide, currently saving time and effort in thousands of organizations worldwide, has itself been simplified, then refined and extended to include the innovative AgileOPPM™.

    This Second Edition will include new material and updates including an introduction of the ground-breaking AgileOPPM™ and an overview of MyOPPM™ template builder, available online
    Includes references throughout the book to the affiliated sections in the Project Management Body of Knowledge (PMBOK®)
    Shows templates for the Project Management Office (PMO)

    This new and updated Second Edition will help you master the one-page approach to both traditional project management and Agile project management.

    (PMBOK is a registered mark of the Project Management Institute, Inc.)

    Book 3: The Lazy Project Manager, Second Edition by Peter Taylor

    Synopsis from Amazon:

    Peter Taylor reveals how adopting a more focused approach to life, projects and work can make you twice as productive.
    The Lazy Project Manager has been the project management book to own in the last six years and now this new edition brings the art of lazy productivity bang up to date. Anyone can apply the simple techniques of lazy project management to their own activities in order to work more effectively and improve their work-life balance. By concentrating your project management and learning to exercise effort where it really matters, you can learn to work smarter. Welcome to the home of ‘productive laziness’. Inside this insightful and informative book you’ll discover:
    • The intelligence of laziness – why smart, lazy people have the edge over others;
    • Why The Jungle Book’s ‘Bare Necessities’ should be the productive lazy theme tune;
    • How to get the maximum output for a minimized input;
    • Quick tips to productive lazy heaven, including avoiding project surprises and being lazy on several projects at once.
    You’ll also find out why you should never go ballooning, how to deliver a good Oscar acceptance speech, and why it is important for your team that you read the newspaper each morning. And yes, you may even learn some quick, simple, but incredibly important things about project management. If you are lazy enough.


    May 29, 2019  9:40 PM

    3 Books On Project Management On @Amazon #ProjectManagement

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Project management

    Book 1: Project Management for the Unofficial Project Manager: A FranklinCovey Title by Kory Kogon, Suzette Blakemore, and James Wood

    Its synopsis on Amazon reads as:

    No project management training? No problem!

    In today’s workplace, employees are routinely expected to coordinate and manage projects. Yet, chances are, you aren’t formally trained in managing projects—you’re an unofficial project manager.

    FranklinCovey experts Kory Kogon, Suzette Blakemore, and James Wood understand the importance of leadership in project completion and explain that people are crucial in the formula for success.

    Project Management for the Unofficial Project Manager offers practical, real-world insights for effective project management and guides you through the essentials of the people and project management process:

    Initiate
    Plan
    Execute
    Monitor/Control
    Close

    Unofficial project managers in any arena will benefit from the accessible, engaging real-life anecdotes, memorable “Project Management Proverbs,” and quick reviews at the end of each chapter.

    If you’re struggling to keep your projects organized, this book is for you. If you manage projects without the benefit of a team, this book is also for you. Change the way you think about project management—”project manager” may not be your official title or necessarily your dream job, but with the right strategies, you can excel.

    Book 2:

    A Guide to the Project Management Body of Knowledge (PMBOK® Guide)–Sixth Edition by Project Management Institute

    Book 3: The Fast Forward MBA in Project Management (Fast Forward MBA Series) by Eric Verzuh

    Synopsis from Amazon:

    The all-inclusive guide to exceptional project management

    The Fast Forward MBA in Project Management is a comprehensive guide to real-world project management methods, tools, and techniques. Practical, easy-to-use, and deeply thorough, this book gives you the answers you need now. You’ll find the cutting-edge ideas and hard-won wisdom of one of the field’s leading experts, delivered in short, lively segments that address common management issues. Brief descriptions of important concepts, tips on real-world applications, and compact case studies illustrate the most sought-after skills and the pitfalls you should watch out for. This new fifth edition features new case studies, new information on engaging stakeholders, change management, new guidance on using Agile techniques, and new content that integrates current events and trends in the project management sphere.

    Project management is a complex role, with seemingly conflicting demands that must be coordinated into a single, overarching, executable strategy — all within time, resource, and budget constraints. This book shows you how to get it all together and get it done, with expert guidance every step of the way.

    Navigate complex management issues effectively
    Master key concepts and real-world applications
    Learn from case studies of today’s leading experts
    Keep your project on track, on time, and on budget

    From finding the right sponsor to clarifying objectives to setting a realistic schedule and budget projection, all across different departments, executive levels, or technical domains, project management incorporates a wide range of competencies. The Fast Forward MBA in Project Management shows you what you need to know, the best way to do it, and what to watch out for along the way.


    May 28, 2019  11:20 PM

    Future of Voice Based Assistants – Ideas of 2 Key Players @Microsoft

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Bot, Cortana, Microsoft

    Jonathan Foster and Deborah Harrison are both non-technical by qualification and profession. Yet they hold important positions in one of the top technology companies in the world: Microsoft. Both are jointly engaged in a special task: designing a unique personality for various voice-based personal assistants at Microsoft. For instance, one of the key projects is to create a personality for Microsoft’s Cortana, one of its AI-powered assistants holding a lot of promise for the coming future. Foster has been writing for film, theatre, and television for many years. He leads a team at Microsoft that is responsible for creating guidelines for AI-powered conversational bots and assistants across various Microsoft platforms. Harrison is a part of Foster’s team. She was the first writer for Cortana’s user interface.

    Deborah says, “For quite some time, in the beginning, I was the only person who was on the writing team for Cortana. It was a pretty forward-thinking feature. There is no relationship between writing for a digital agent and writing for any other user interface except for the fact that it’s all words and I’m trying to create a connection. Initially, we were looking at straightforward strings because we started with some scenarios like setting an alarm or checking the calendar. But while writing those strings, we started thinking about what it would sound like and we realized that the agent should have a more concrete identity so that we could tell what to say when and under what circumstances it should sound apologetic versus more confident and so on.”

    Foster says, “We don’t want to get into a situation where we’re creating life-like interaction models that are addictive. Tech can move in that direction when you are so excited about the potential of what you can build that you’re not thinking about its impact. Thankfully, Microsoft has been a leader in ethics all up and we are a mature company that can pause and think about this stuff.”

    You can read the complete interview here: https://news.microsoft.com/en-in/features/building-personalities-ai-jonathan-foster-deborah-harrison/


    May 23, 2019  10:39 PM

    5 Reasons To Join Digital Predictive Maintenance Conference

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Predictive maintenance

    So, whether you join me at the upcoming Digital Predictive Maintenance Conference in Bangkok in September 2019 is solely at your discretion. But it is my privilege to share some good reasons why you should attend. We all strive to avoid equipment failures, irrespective of whether we are a product or a service company. After all, it is not a question of survival or existence but of excelling and leading the tough race with peer businesses. Digital predictive maintenance can help us to a large extent in that. We all rely on equipment in whatever industry we work in: in the factory, it is machines and tools; in the office, it is laptops, networks, servers, the internet, and many other types of equipment. These days we can get real-time monitoring along with the ability to analyze large volumes of relevant data.

    This analysis can help in predicting equipment failure, with instant notification to the concerned stakeholders so they can take appropriate action and avoid any delays. Networked remote sensors, cloud systems, machine learning, the internet of things, and big data are some of the emerging technologies empowering digital predictive maintenance. The primary application of industrial analytics in the coming years will be the predictive maintenance of industrial machinery. Unplanned downtime, caused mostly by equipment failure, costs industrial manufacturers $50 billion per year, according to a study recently concluded by The Wall Street Journal and Emerson. Digital predictive maintenance not only saves up to 40% on maintenance costs but also reduces capital investments in new equipment by up to 5%. Industry professionals must learn about proven technological concepts in predictive maintenance so as to deploy the most efficient maintenance policy and procedures to optimize equipment reliability, profits, and production uptime.
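
    To put those percentages in perspective, here is purely illustrative arithmetic (the 40% and 5% figures are quoted above; the budget numbers are invented):

        # Hypothetical annual budgets for one plant, in dollars.
        maintenance_budget = 10_000_000
        capital_budget = 20_000_000

        maintenance_savings = maintenance_budget * 0.40  # up to 40% of maintenance
        capital_savings = capital_budget * 0.05          # up to 5% of new equipment

        print(f"maintenance savings: ${maintenance_savings:,.0f}")  # $4,000,000
        print(f"capital savings: ${capital_savings:,.0f}")          # $1,000,000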

    Comment to know more about Digital Predictive Maintenance Conference

    The Digital Predictive Maintenance Conference in Bangkok will have more than 60 speakers and 300 delegates. There will be 4 parallel events: Sensor Tech, Digital Shutdowns and Turnarounds, Digital EPC, and Digital Predictive Maintenance.


    May 19, 2019  9:49 PM

    Smart Tech 2019 Focuses On Present And Future of VTS @itriangleindia

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Connected vehicles, Smart cities, smart city

    The key focus of the two-day Smart Tech 2019 event organized by the Smart Mobility Association was the present and future of vehicle tracking technology. Most of the talks and panel discussions revolved around various aspects of the topic. As a two-day industry symposium, it drew a high-level industry presence, not only from the government sector but also from the public and private sectors. It emerged that the modern-day automotive sector can’t survive without the technology, telecom, and mobility sectors; only with them can it evolve into smart automotive. The event was presented by iTriangle Infotech, India’s largest manufacturer of vehicle telematics devices. The symposium on vehicle tracking technology was very much the need of the hour in order to bring together and synchronize all industry leaders in this ecosystem. The Smart Mobility Association was quite successful in its initiative.

    Smart Tech 2019

    If you look at the various developments taking place across the vehicle tracking technology spectrum globally, the situation is very dynamic. In a way, what is happening today is laying a proper foundation for the future of navigation, connectivity, telecom, technology, and smart transport. All of these further lead to smart cities and a smart nation. There is a lot of thrust in the country on digitization. Almost Rs. 2 trillion of spending is in the pipeline for the development of smart cities in India. A lot of focus is thus on getting the basics in place, like road construction, traffic management, fleet tracking, etc. A large chunk of public transport is in the process of installing vehicle tracking technology. Smart Tech 2019 is a one-of-its-kind event in this regard, covering the topic from a 360-degree perspective, including promises, perspectives, challenges, scope, and risks.

    Smart Tech 2019

    There were more than 250 delegates from various industry segments like automotive, telecom, IT, and mobility at the recently concluded Smart Tech 2019, with a key focus on vehicle tracking systems. Participants included the Ministry of Road Transport and Highways of the Government of India, the Automotive Research Association of India (ARAI), the International Centre for Automotive Technology (ICAT), Delhi Integrated Multi-Modal Transit System (DIMTS) Limited, and a number of big names in the industry. Rajiv Arora, General Secretary, Smart Mobility Association, says, “The economic scenario of today demands increased productivity while driving down costs, and many businesses are now looking for innovative ways to refine their processes. We are hopeful that this forum shall offer a cross-industry perspective towards this direction, and put forth best practices of vehicle tracking technology in India as well as across the globe.”

    Dinesh Tyagi, Director, ICAT, said, “Of late, a dire need has been felt for platforms where discussions can be initiated about new-age smart devices and technologies. I am sure Smart Tech 2019 will address this issue and provide a comprehensive knowledge platform for all the industry experts and professionals alike.”

