Quality Assurance and Project Management


August 6, 2019  11:37 PM

PCIe Gen4 Storage Empowered with a Portfolio of Products From Phison

Jaideep Khanduja
NVMe, PCIe

Phison has become synonymous with industry-first products that prove its leadership in the global market. This time it is entering the PCIe Gen4 storage market with a portfolio of products. As a matter of fact, Phison is the first and only company shipping PCIe® Gen4x4 NVMe SSD solutions. That makes it the industry leader enabling high-performance computing for high-speed, high-volume, bandwidth-hungry applications that handle millions of transactions and massive data movements. Phison has become a landmark in itself and a benchmark for others in developing these solutions. The solutions it provides are raising application expectations to a new level, meeting the requirement for faster and higher-definition digital transactions. In fact, these expectations are increasing exponentially with the rapid adoption of newer technologies like big data, the Internet of Things, Machine Learning, Artificial Intelligence, and Virtual/Augmented Reality.

Very few of us know that the PCIe 4.0 standard doubles the data transfer rate of its predecessor, PCIe 3.0. This doubled data transfer rate, along with the signal reliability and integrity of PCIe Gen4, lets technology solution providers deliver higher performance, enhanced flexibility, and decreased latency across a huge range of applications, including PC, mobile, gaming, networking, and storage.
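For a rough sense of what that doubling means, the short sketch below computes the theoretical per-direction bandwidth of Gen3 x4 and Gen4 x4 links from the published per-lane signaling rates (8 GT/s and 16 GT/s, both with 128b/130b line encoding). These are ceiling figures that ignore protocol overhead beyond line encoding.

```python
# Theoretical per-direction PCIe link bandwidth from the signaling rate.
# Gen3 and Gen4 both use 128b/130b encoding, so 128/130 of the raw
# transfer rate carries usable data.

def pcie_bandwidth_gb_s(gt_per_s: float, lanes: int) -> float:
    """Usable bandwidth in GB/s for one direction of a link."""
    encoding_efficiency = 128 / 130
    bits_per_s = gt_per_s * 1e9 * encoding_efficiency * lanes
    return bits_per_s / 8 / 1e9  # bits -> bytes -> GB

for gen, rate in (("Gen3", 8.0), ("Gen4", 16.0)):
    print(f"PCIe {gen} x4: {pcie_bandwidth_gb_s(rate, 4):.2f} GB/s")

# PCIe Gen3 x4: 3.94 GB/s
# PCIe Gen4 x4: 7.88 GB/s
```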

Chris Kilburn, corporate vice president and general manager, Client Channel, AMD, says, “There is continued pressure in the industry to improve the performance of computing systems to support the applications that end users are most interested in. PCIe 4.0 offers manufacturers a way to meet these consumer demands. AMD is delighted to work with Phison to raise the bar by introducing first-to-market solutions. Through sound engineering and design, we are working together to deliver the experiences our customers demand.”


Sumit Puri, CEO and Co-Founder of Liqid, says, “PCIe Gen4 will unleash the performance capabilities required for next-generation data-centric applications, including artificial intelligence and 5G edge computing. The LQD4500 provides 32TB of capacity and a PCIe Gen4x16 interface that enables over 24GB/s of throughput and 4 million IOPS. This impressive performance is only possible by aggregating multiple Phison E16 NVMe controllers into a single device. The Phison E16 provides industry-leading performance, capacity and NVMe features required to build the PCIe Gen4 enabled data center of the future. Liqid is excited that the Phison E16 is now powering the fastest storage in the world, the LQD4500.”


Phison’s stronghold in memory technology, innovation in flash memory products, and excellence in engineering are the key factors that establish it as a market leader known for its first-to-market expertise. While most of its peers have yet to debut Gen4 solutions, Phison has developed a package of products to cater to multiple sockets within the consumer space. The new portfolio of PCIe Gen4x4 NVMe products, comprising the PS5016-E16, PS5019-E19T, and PS5018-E18, will be released within a year’s timeframe, becoming available in that order.


K.S. Pua, CEO of Phison Electronics, says, “After several years since the announcement of the standard, the era of PCIe 4.0 solutions is upon us and Phison is at the forefront of this movement with our portfolio of Gen4x4 solutions. We pride ourselves with our long history of innovation supporting emerging technologies. From doubling transfer rates to improving power consumption to increasing performance, Phison-based SSD solutions allow our integration partners to deliver the next-generation PC, gaming and storage systems needed to satisfy increasing consumer demand.”

Phison is showcasing its products at the Flash Memory Summit (FMS), August 6-8 in Booth No. 219 at the Santa Clara Convention Center in Santa Clara, California.

July 31, 2019  11:49 PM

Designing Data-Intensive Applications @Amazon

Jaideep Khanduja

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems 1st Edition, Kindle Edition by Martin Kleppmann

Excerpt from Amazon.com

Data is at the center of many challenges in system design today. Difficult issues need to be figured out, such as scalability, consistency, reliability, efficiency, and maintainability. In addition, we have an overwhelming variety of tools, including relational databases, NoSQL datastores, stream or batch processors, and message brokers. What are the right choices for your application? How do you make sense of all these buzzwords?

In this practical and comprehensive guide, author Martin Kleppmann helps you navigate this diverse landscape by examining the pros and cons of various technologies for processing and storing data. The software keeps changing, but the fundamental principles remain the same. With this book, software engineers and architects will learn how to apply those ideas in practice, and how to make full use of data in modern applications.

Peer under the hood of the systems you already use and learn how to use and operate them more effectively
Make informed decisions by identifying the strengths and weaknesses of different tools
Navigate the trade-offs around consistency, scalability, fault tolerance, and complexity
Understand the distributed systems research upon which modern databases are built
Peek behind the scenes of major online services, and learn from their architectures

From the Publisher:

Who Should Read This Book?
If you develop applications that have some kind of server/backend for storing or processing data, and your applications use the internet (e.g., web applications, mobile apps, or internet-connected sensors), then this book is for you.

This book is for software engineers, software architects, and technical managers who love to code. It is especially relevant if you need to make decisions about the architecture of the systems you work on—for example if you need to choose tools for solving a given problem and figure out how best to apply them. But even if you have no choice over your tools, this book will help you better understand their strengths and weaknesses.

You should have some experience building web-based applications or network services, and you should be familiar with relational databases and SQL. Any non-relational databases and other data-related tools you know are a bonus, but not required. A general understanding of common network protocols like TCP and HTTP is helpful. Your choice of programming language or framework makes no difference for this book.

If any of the following are true for you, you’ll find this book valuable:
You want to learn how to make data systems scalable, for example, to support web or mobile apps with millions of users.
You need to make applications highly available (minimizing downtime) and operationally robust.
You are looking for ways of making systems easier to maintain in the long run, even as they grow and as requirements and technologies change.
You have a natural curiosity for the way things work and want to know what goes on inside major websites and online services. This book breaks down the internals of various databases and data processing systems, and it’s great fun to explore the bright thinking that went into their design.

Sometimes, when discussing scalable data systems, people make comments along the lines of, ‘You’re not Google or Amazon. Stop worrying about scale and just use a relational database’. There is truth in that statement: building for scale that you don’t need is wasted effort and may lock you into an inflexible design. In effect, it is a form of premature optimization. However, it’s also important to choose the right tool for the job, and different technologies each have their own strengths and weaknesses. As we shall see, relational databases are important but not the final word on dealing with data.

Scope of This Book
This book does not attempt to give detailed instructions on how to install or use specific software packages or APIs since there is already plenty of documentation for those things. Instead, we discuss the various principles and trade-offs that are fundamental to data systems, and we explore the different design decisions taken by different products.

We look primarily at the architecture of data systems and the ways they are integrated into data-intensive applications. This book doesn’t have space to cover deployment, operations, security, management, and other areas—those are complex and important topics, and we wouldn’t do them justice by making them superficial side notes in this book. They deserve books of their own.

Many of the technologies described in this book fall within the realm of the Big Data buzzword. However, the term ‘Big Data’ is so overused and underdefined that it is not useful in a serious engineering discussion. This book uses less ambiguous terms, such as single-node versus distributed systems, or online/interactive versus offline/batch processing systems.

This book has a bias toward free and open-source software (FOSS) because reading, modifying, and executing source code is a great way to understand how something works in detail. Open platforms also reduce the risk of vendor lock-in. However, where appropriate, we also discuss proprietary software (closed-source software, software as a service, or companies’ in-house software that is only described in the literature but not released publicly).


July 31, 2019  11:41 PM

NAND Flash Memory Technologies @Amazon

Jaideep Khanduja

NAND Flash Memory Technologies (IEEE Press Series on Microelectronic Systems) 1st Edition, Kindle Edition by Seiichi Aritome (Author)

Offers a comprehensive overview of NAND flash memories, with insights into NAND history, technology, challenges, evolutions, and perspectives

Describes new program disturb issues, data retention, power consumption, and possible solutions for the challenges of 3D NAND flash memory

Written by an authority in NAND flash memory technology, with over 25 years’ experience

From the Back Cover
Examines the history, basic structure, and processes of NAND flash memory

This book discusses basic and advanced NAND flash memory technologies, including the principle of NAND flash, memory cell technologies, multi-bits cell technologies, scaling challenges of the memory cell, reliability, and 3-dimensional cell as the future technology. Chapter 1 describes the background and early history of NAND flash. The basic device structures and operations are described in Chapter 2. Next, the author discusses the memory cell technologies focused on scaling in Chapter 3 and introduces the advanced operations for multi-level cells in Chapter 4. The physical limitations for scaling are examined in Chapter 5, and Chapter 6 describes the reliability of NAND flash memory. Chapter 7 examines 3-dimensional (3D) NAND flash memory cells and discusses the pros and cons in structure, process, operations, scalability, and performance. In Chapter 8, challenges of 3D NAND flash memory are discussed. Finally, in Chapter 9, the author summarizes and describes the prospect of technologies and market for the future NAND flash memory.

NAND Flash Memory Technologies is a reference for engineers, researchers, and designers who are engaged in the development of NAND flash memory or SSD (Solid State Disk) and flash memory systems.

About the Author
Seiichi Aritome was a Senior Research Fellow at SK Hynix Inc. in Icheon, Korea from 2009 to 2014. He has contributed to NAND flash memory technologies for over 27 years in several companies and nations. Aritome was a Program Director at Powerchip Semiconductor Corp. in Hsinchu, Taiwan, a Senior Process Reliability Engineer at Micron Technology Inc. in Idaho, USA, and a Chief Specialist at Toshiba Corporation in Kawasaki, Japan. He received his Ph.D. from Graduate School of Advanced Sciences of Matter, Hiroshima University, Japan. Aritome is an IEEE Fellow and a member of the IEEE Electron Device Society.


July 31, 2019  11:01 PM

Phison Showcases Technology Innovation Leadership at FMS2019

Jaideep Khanduja
Flash memory, NVMe, PCIe, PCIe SSD, SSD

If you are at the Flash Memory Summit, which runs from August 6 to August 8 at the Santa Clara Convention Center in Santa Clara, California, don’t forget to visit Booth number 219, because you are going to witness some of the most innovative technology there. Phison is a pioneering company delivering best-in-class SSD solutions. In fact, Phison is the only company with PCIe Gen4x4 NVMe SSD solutions at Flash Memory Summit. There will be a lot of partner demos and interesting panel participation at the summit. Phison brings its technology innovation leadership on full display at Flash Memory Summit 2019. Phison Electronics is the industry leader in flash controller and NAND solutions. It will showcase its lineup of PCIe Gen4 SSD solutions, including the public debut of its power-conscious PS5019-E19T controller, at Booth number 219.


As a matter of fact, it is the first and only company ready with PCIe Gen4x4 NVMe SSD solutions. There will be a demonstration of how its controllers push the boundaries of low power consumption and high performance for storage. Being shown publicly for the first time is Phison’s E19T controller, which offers low power consumption for mainstream drives while promising best-in-class power savings and reduced cooling needs in data centers. That is, of course, a phenomenal achievement. In addition, Phison is offering a preview of the company’s next-generation Gen4x4 PS5018-E18 controller, an optimized design that extracts the performance advantages of the PCIe 4.0 interface. This enables the company to mark a new achievement in performance leadership in Gen4 SSDs.

Phison Elevates Scale in SSD Solutions

That is not all. The company will also showcase for the first time its PS5013-E13T 1113 BGA SSD at FMS. With this, the customer gets all the advantages of flash technology in an ultra-thin, ultra-compact 1113 BGA form factor. The E13T BGA SSD can perform up to 1.7 GB per second sequential read and 1.1 GB per second sequential write while consuming only 1.5 watts. That ensures prolonged battery life for any embedded solution. At the same booth, there will be more demonstrations from Phison’s technology partners Liqid and Cigent Technology Inc. Liqid will be showcasing its ultra-high-performance Gen4 NVMe full-height, full-length add-in card, model LQD4500, powered by Phison’s E16 controller. It is capable of 5 million IOPS and 24 GB per second of throughput. The card is available with up to 32 TB of capacity.

On the other hand, Cigent will demonstrate its Dynamic Data Defense Engine (D3E™) for Windows 10. D3E, when paired with a Phison E12-based SSD, helps prevent the exfiltration of sensitive data as soon as a system is compromised. As a matter of fact, the Phison E12 allows D3E to support “on the fly” firmware-based folder locking; the moment the threat level is elevated, the locked folders can only be accessed with higher-level authentication.

Phison Electronics

K.S. Pua, CEO, Phison Electronics, says, “Whether in the audience at one of our speaker presentations or stopping by our booth for a demonstration of our next-generation technologies, FMS attendees will have an excellent opportunity to learn how Phison is leading the way in delivering high-performance solutions that meet the ever-increasing needs of the data storage market. FMS is the ideal setting for us to demonstrate this leadership, as well as the perfect venue to publicly show our E19 for the first time. We look forward to a great show.”


July 30, 2019  11:33 PM

Storage Economics Sets A New Benchmark with S1aaS @StorOne_Inc

Jaideep Khanduja
Backup storage, Data storage software, Dell, FPGA, Mellanox, Storage

Storage economics gets a new paradigm shift with the announcement of S1-as-a-Service (S1aaS) from StorONE. What it means is that a comprehensive enterprise storage solution is now feasible at a cost-effective subscription price that never existed before, nor had it been imagined or initiated by any other venture so far. StorONE’s innovation behind the S1 storage software platform brings S1aaS. S1aaS, in simple terms, is a use-based solution integrating the enterprise-class S1 storage service with Dell Technologies and Mellanox hardware. This integrated solution showcases the next level of storage economics, where the industry can define its price point. This transformation delivers a balance between enterprise-level performance and data protection capabilities. It also answers a long-pending question that has hovered around without a solution: proper resource utilization, which StorONE has addressed while beautifully balancing security risks and performance impacts.


These security risks and performance impacts are usually associated with the cloud, which on one hand provides reliability and capabilities as good as those of an on-premises model, and on the other hand guarantees you only pay for what you need to support your business requirements. All of this happens flawlessly and in a very transparent manner. This is one of the best propositions for the customer, who gets the best of both worlds: cloud-like simplicity on one hand and flexible pricing on the other. That is, the environment remains that of a cloud-based model while the performance and control are like an on-premises infrastructure.

Storage Economics Achieves a New Landmark

The pricing of the S1aaS model starts at $999 per month for an 18-terabyte (TB) all-flash array that performs up to 150,000 IOPS. This, as a matter of fact, is a very flexible model, with customer-definable pricing along with some of the best capacity and performance capabilities available in the market.
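As a quick back-of-the-envelope check on that price point, using only the figures quoted above:

```python
# Effective unit costs of the S1aaS entry tier, from the quoted figures.
monthly_price_usd = 999
capacity_tb = 18
iops = 150_000

print(f"Cost per TB per month:      ${monthly_price_usd / capacity_tb:.2f}")    # ~$55.50
print(f"Cost per 1,000 IOPS/month:  ${monthly_price_usd / (iops / 1000):.2f}")  # ~$6.66
```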

Gal Naor, CEO and co-founder of StorONE, says, “S1aaS is going to change the economics not only of storage but of the entire data center. S1aaS makes enterprise-class all-flash array performance and data protection and control available for only $999 per month. No other vendor can offer a complete storage solution – whether on-premises or in the cloud – for this low of a monthly cost.”

George Crump, Founder and Lead Analyst, Storage Switzerland, says, “As high-performance, SAS and NVMe flash drives become commonplace in the data center, storage media is no longer the bottleneck to performance. The storage management layer is a problem. Vendors try to compensate by using more powerful processors, more RAM, custom FPGAs, and ASICs, as well as spreading I/O across dozens of flash drives, whose capacity is not needed. StorONE’s focus on efficiency – 150K IOPS from four conventional drives, an industry-defining capability – is the foundational component of S1aaS. It enables the democratization of storage performance previously unavailable to the data center.”

Motti Beck, Senior Director, Enterprise Market Development, Mellanox, says, “Advanced storage solutions like this require high-performance, programmable and intelligent networks. The combination of StorONE’s S1 software and Mellanox Ethernet Storage Fabric solutions eliminates the traditional bottlenecks that have been associated with server-to-storage communication and supports critical storage features, which improves data center efficiency and ensures the best user experience available.”

Further information about S1aaS is available at http://bit.ly/2JWBb56


July 24, 2019  11:44 PM

NVIDIA Partner Network Strengthens With the entry of @SwiftStack @NVIDIA

Jaideep Khanduja
Artificial intelligence, Cloud data storage, Machine learning, NVIDIA, Public Cloud, SwiftStack

According to the latest release, SwiftStack joins the NVIDIA Partner Network. It means a lot to major industries like automobile, healthcare, and telecom, to name a few. Now autonomous vehicles, telecom, and healthcare can leverage large-scale artificial intelligence and machine learning data pipelines from edge to core to cloud. That covers a complete spectrum, in fact. SwiftStack, as we all know, is the market leader in multi-cloud data storage and management. The company announces its entry into the NVIDIA Partner Network (NPN) program as a key solution provider for artificial intelligence and machine learning use cases. SwiftStack uses NVIDIA DGX-1 systems and the NGC container registry of GPU-optimized software in its latest state-of-the-art storage solution. The solution covers large-scale Artificial Intelligence (AI) and Machine Learning (ML) along with deep learning workflows that span edge-to-core-to-cloud data pipelines for the use cases mentioned above.


As a matter of fact, the NPN Solution Advisor Program gives NVIDIA customers full access to world-class solution experts with deep knowledge of enterprise-wide integration with NVIDIA DGX-1 clusters. Adding further value, SwiftStack’s AI/ML solution can deliver massive storage parallelism and throughput to NVIDIA GPU compute and NGC. The use cases cover a wide range, including data ingest, training and inferencing, and data services, to support any kind of AI/ML workflow. On top of that, the SwiftStack 1space solution extends to the public cloud so that customers can benefit from cloud-bursting and economies of scale. At the same time, the data stays secure on-premises.

SwiftStack joins NVIDIA Partner Network

Amita Potnis, Research Director at IDC’s Infrastructure Systems, Platforms and Technologies Group, says, “Infrastructure challenges are the primary inhibitor for broader adoption of AI/ML workflows. SwiftStack’s multi-cloud data management solution is the first of its kind in the industry and effectively handles storage I/O challenges faced by the edge to core to cloud, large-scale AI/ML data pipelines.”


Shailesh Manjrekar, Head of AI/ML Solutions Marketing and Corporate Development at SwiftStack, says, “The SwiftStack solution accelerates data pipelines, eliminates storage silos, and enables multi-cloud workflows, thus delivering faster business outcomes. Joining NVIDIA’s Partner Network program builds upon the success we are seeing with large-scale AI/ML data pipeline customers and endorses our value to these environments.”

Craig Weinstein, Vice President, Americas Partner Organization at NVIDIA, says, “NVIDIA AI solutions are used across transportation, healthcare and telecommunication industries. Our high-performance computing platform needs fast storage and SwiftStack brings on-premises, scale-out, and geographically distributed storage that makes them a good fit for our NPN Solution Advisor Program.”


July 21, 2019  11:33 PM

Protect Your Physical, Virtual, and Cloud With NAKIVO v9 @Nakivo

Jaideep Khanduja
Data Recovery, Deduplication, Nakivo, VM backup

With the release of NAKIVO v9, support for physical Windows Server backup is here. As a matter of fact, NAKIVO Backup & Replication v9 now provides 100% protection for physical, virtual, and cloud environments. NAKIVO Inc. has proven its mettle in a very short span. It is one of the fastest-growing software companies, with the sole aim of providing enterprise solutions to protect virtual and cloud environments. With this announcement, NAKIVO creates a new landmark across physical, virtual, and cloud environments. The new version, i.e. v9, adds support for Microsoft Windows Server backup, thereby empowering its customers to safeguard physical, virtual, and cloud environments from a single point. That is an extremely useful feature for an enterprise from a system-upkeep point of view. There are certain key features of the new release. For instance, it supports application-consistent backup.

What we mean by application-consistent backup is that the new release, NAKIVO Backup & Replication v9, can take care of incremental, application-aware backups of physical Microsoft Windows Servers. That means the solution now provides application-consistent backups of business-critical applications, including databases, running on physical Windows Servers. This covers different Microsoft servers, viz. Microsoft Exchange, SQL Server, Active Directory, and SharePoint, as well as Oracle. Another key feature is global data deduplication. Now the backups of physical servers can be stored in a regular backup repository, easily alongside backups of VMs and AWS EC2 instances. All the backups stored in a backup repository can, in turn, be automatically deduplicated irrespective of the platform. This ensures that only unique data blocks are saved, which results in a tremendous saving of the storage space used by physical machine backups.
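To illustrate the idea behind block-level deduplication (a minimal sketch, not NAKIVO’s actual implementation): each fixed-size block of the backup stream is hashed, and a block is written to the repository only if that hash has not been seen before.

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks; real products vary

def dedup_backup(stream, repository: dict) -> list:
    """Store only unique blocks; return the recipe to rebuild the stream."""
    recipe = []
    while True:
        block = stream.read(BLOCK_SIZE)
        if not block:
            break
        digest = hashlib.sha256(block).hexdigest()
        if digest not in repository:    # first time this block is seen
            repository[digest] = block  # unique data is saved exactly once
        recipe.append(digest)           # the backup is an ordered hash list
    return recipe
```

A second backup of a machine whose data barely changed would add almost no new blocks to the repository; only its small recipe grows.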

NAKIVO Backup & Replication v9 also means instant granular recovery. You can now instantly recover files, folders, or Microsoft application objects (for any Microsoft server) directly from previously created, deduplicated physical machine backups. In addition, customers can use the Universal Application Object Recovery feature when instant recovery of objects from any other application is required. The new version supports physical-to-virtual (P2V) recovery. Besides instantly recovering files, folders, and objects from physical server backups, enterprises can also restore physical Windows Server backups to VMware and Hyper-V VMs. While the new version tackles so many complexities, the pricing model is quite simple. It starts at $17 per machine per year, which is probably the most cost-effective per-machine subscription model. A single per-machine license covers any one of a VMware VM, a Hyper-V VM, a physical machine, a Nutanix AHV VM, or an AWS EC2 instance.

This gives customers a high level of flexibility and minimal dependence on vendors. Customers can now easily remove vendor lock-in and move their workloads between platforms without any need to change their data protection licensing. Bruce Talley, CEO, NAKIVO Inc., says, “NAKIVO Backup & Replication v9 enables our customers to not only protect their business-critical workloads across virtual and cloud environments but now also physical Windows Server systems. Now our customers who have physical or mixed environments can protect their critical business data from a single pane of glass.”

RESOURCES
Trial Download: www.nakivo.com/resources/download/trial-download/
Success Stories: www.nakivo.com/customers/success-stories/
Datasheet: www.nakivo.com/res/files/nakivo-backup-replication-datasheet.pdf


July 14, 2019  12:10 AM

1st Data Orchestration Platform with Multi-cloud Analytics AI @Alluxio

Jaideep Khanduja
Cloud analytics, data orchestration

Alluxio is a well-known name among the world’s top internet companies. In fact, 7 out of the top 10 internet companies use open-source data orchestration technology developed by Alluxio. The launch of Alluxio 2.0 at the recent AWS Summit in New York brings a lot more power to it. The open-source and enterprise editions of Alluxio 2.0 simplify and accelerate the adoption and deployment of multi-cloud, data-hungry workloads. Alluxio 2.0 brings breakthrough innovations for data engineers who are responsible for managing and deploying analytical and AI workloads in the cloud.

The solution works equally well in hybrid and multi-cloud environments. Demand for compute workloads is growing tremendously across the globe, and cloud adoption has multiplied this requirement exponentially. Organizations are adopting a decoupled architecture for modern workloads, in which compute scales independently from storage. That brings in new data engineering problems.


The new paradigm definitely enables elastic scaling, but at the same time it creates new data engineering problems. That raises an immediate need for an abstraction layer. Just as compute and containers work well with Kubernetes, data badly needs orchestration as data silos increase. Data orchestration will not only bring data locality but also enable data accessibility and data elasticity for compute across data silos. Those silos span different zones, regions, and even clouds. That is where Alluxio 2.0 Community Edition and Enterprise Edition come into the picture. The two editions bring new capabilities across all the critical segments that are causing gaps in today’s cloud data engineering market. Alluxio 2.0 is a true example of breakthrough data orchestration innovation for multi-cloud. It delivers policy-driven data management, improved administration, efficient cross-cloud data services, focused compute, and integration.
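As a conceptual sketch of what policy-driven data management means here (the names and rules below are hypothetical illustrations, not Alluxio’s API): a policy maps parts of a logical namespace to placement targets across silos, and the orchestration layer resolves each logical path to wherever the policy says the data should live.

```python
from dataclasses import dataclass

# Hypothetical illustration of policy-driven placement across silos.
# This is the shape of the idea, not Alluxio's actual interface.

@dataclass
class PlacementPolicy:
    path_prefix: str  # logical namespace, e.g. "/warm/"
    target: str       # storage silo: a zone, region, or cloud

POLICIES = [
    PlacementPolicy("/hot/",  "on-prem-ssd"),
    PlacementPolicy("/warm/", "region-us-east"),
    PlacementPolicy("/cold/", "cloud-object-store"),
]

def resolve(logical_path: str) -> str:
    """Map a logical path to the silo the policy assigns it to."""
    for policy in POLICIES:
        if logical_path.startswith(policy.path_prefix):
            return policy.target
    return "default-store"

print(resolve("/warm/clickstream/2019-07-01.parquet"))  # region-us-east
```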


Haoyuan Li, Founder and CTO, Alluxio says, “With a data orchestration platform in place, a data analyst or scientist can work under the assumption that the data will be readily accessible regardless of where the data resides or the characteristics of the storage. They can focus on building data-driven analytical and AI applications to create values, without worrying about the environment and vendor lock-in. These new advancements to Alluxio’s data orchestration platform further cement our commitment to a cloud-native, open-source approach to enabling applications to be compute, storage and cloud agnostic.”

Mike Leone, Analyst, ESG says, “Data is only as useful as the insights derived from it and with organizations trying to analyze as much data as possible to gain a competitive edge, it’s challenging to find useful data that’s spread across globally-distributed silos. This data is being requested by various compute frameworks, as well as different types of users hoping to gain actionable insight. These multiple layers of complexity are driving the need for a solution to improve the process of making the most valuable data accessible to compute at the speed of innovation. Alluxio has identified an important missing piece that makes data more local and easily accessible to data-powered compute frameworks regardless of where the data resides or the characteristics of the underlying storage systems and clouds.”


Steven Mih, CEO, Alluxio says, “Whether by design or by departmental necessity, companies are facing an explosion of data that is spread across hybrid and multi-cloud environments. To maintain a competitive advantage, speed and depth of insight have become the requirement. Data-driven analytics that was once run over many hours, now need to be done in seconds. AI/ML models need to be trained against larger-and-larger datasets. This all points to the necessity of a data tier which orchestrates the movement and policy-driven access of a companies’ data, wherever it may be stored. Alluxio abstracts the storage and enables a self-service culture within today’s data-driven company.”

Both Alluxio 2.0 Community and Enterprise Edition are now generally available for download via tarball, docker, brew, etc.

Resources

Alluxio 2.0 release page – https://www.alluxio.io/
Download Alluxio 2.0 – https://www.alluxio.io/
Founder blog – https://www.alluxio.io/blog/
Product blog – https://www.alluxio.io/blog/2-


July 13, 2019  11:12 PM

Tachyum Inc’s 64-core processor cuts processor power by 10x @Tachyum

Jaideep Khanduja
Artificial intelligence, Data Center, processor

Tachyum is a combination of two Greek words collectively symbolizing ‘an element of speed’. The company fits the name well, and it continuously strives to beat its own records by bringing out a better product every time. Every new product it brings out is a world first. I think the top management of Tachyum is firm on keeping the company creating new landmarks for others to follow. The recent news is a good example: Tachyum is raising $25 million in Series A financing. Rado Danilak, CEO, Tachyum, has a couple of significant laurels to his credit. His last two companies were bought by Western Digital (WD) and LSI/SanDisk. He holds more than 100 patents, which are already in production. Rado has deep knowledge of the semiconductor market in general.


Anybody dealing in chips and processors would be conversant with the Prodigy Universal Processor Chip from Tachyum. It is the smallest and fastest general-purpose 64-core processor developed to date across the globe. The money mentioned is being used for further enhancement of the Prodigy Universal Processor Chip. It requires 10x less processor power than the nearest competitive chips in the market from Intel, NVIDIA, and AMD. Another huge disruption it brings is a 3x cost reduction. These two factors are more than enough to shake the existing markets and redefine the leadership chart in these segments. The development matches well with the AI revolution that demands machines more powerful than the human brain. The ultimate goal is to deliver AI for Good and AI for All. Prodigy has enormous strength: it reduces a data center’s annual total cost of ownership (TCO) by 4x.

Tachyum brings a new revolution in AI with Prodigy

Prodigy from Tachyum is a sheer example of disruptive hardware architecture paired with a smart compiler. In fact, this new design has made many parts of the hardware in a typical processor redundant. The core has become simpler and smaller, and the wires have become fewer and shorter. All this results in greater speed and power efficiency for the processor. That is what the Prodigy Universal Processor Chip is: an ultra-low-power processor capable of enabling an exaflop supercomputer using around 250,000 Prodigy processors.


Adrian Vycital, Managing Partner at IPM Group, Tachyum’s lead investor based in London and Bratislava says, “The work that Tachyum is doing is highly disruptive and will lead to dramatic improvements in burgeoning markets of artificial intelligence and high-performance computing that require extreme processing speeds and power efficiencies. Supporting Tachyum at this stage of their development provides cascading opportunities for unprecedented success, helping them to establish themselves as the leader in what truly is the future of computing.”

Dr. Radoslav Danilak, Co-founder and CEO of Tachyum says, “We are extremely pleased to announce another infusion of working capital into Tachyum, which not only enables us to complete our mission of delivering disruptive products to market but also represents well-reasoned confidence in our approach to overcoming challenges faced by the industry. The ability to change the world takes more than one man’s vision. Having an investment community backing Tachyum allows us to properly build a world-class organization with the best and brightest talent available. We look forward to growing the company and the industry atop the foundation that we’ve already built.”

You can visit the official website here: http://www.tachyum.com


July 10, 2019  9:58 PM

How Safe Is Your Enterprise Backup Data from Malware Attack? @asigra

Jaideep Khanduja
Asigra, Backup and restore, Backup Recovery and Media Services, Cloud Backup, Enterprise Backup, malware

How many CIOs and CTOs can confidently claim that the enterprise backups they take regularly are not contaminated with any kind of malware? A recent report published by DCIG features cybersecurity approaches from legends in the field like Asigra, Rubrik, and Dell. Detecting and preventing malware in your enterprise backup environments is as critical as in production environments. Unless and until you have sufficient knowledge and tools to detect it, it is impossible to respond and keep your enterprise data within safe limits. Asigra Inc. has been a pioneer in cloud backup, recovery, and restore solutions since 1986. It has just announced that the Data Center Infrastructure Group (DCIG) has come out with an important report titled “Creating a Secondary Perimeter to Detect Malware in Your Enterprise Backup Environment.” I think the report is important reading for all CIOs and CTOs.


CTOs and CIOs must read this report to understand the threats and vulnerabilities they are living with. The report gives quite a number of useful insights. It presents a comparison of three approaches that any enterprise can use to detect and prevent malware attacks on backup data, and it further analyzes which approach may be the most effective for enterprise backup environments. It is the purity of your backup sets that creates confidence in recovering lost data when the need arises, so understanding this is of foremost importance. As a matter of fact, enterprises are now understanding the importance of managing the threat that malware poses to backup data in today’s high-risk environments. A successful recovery completely depends on having a confirmed, reproducible set of enterprise backups. That set, in fact, should be completely free from malware.


The DCIG report discusses three methodologies for creating a golden copy of enterprise backup data. The first is the inline scan, in which all incoming and restored backup data is actively scanned for malware in real time. The second recommends a sandbox approach, in which no scan happens while creating a backup set; instead, a separate IT sandbox is set up to recover the data and test it thoroughly for malware. The third is snapshot analysis, in which snapshots of production data are captured and analyzed thoroughly; the results of the analysis decide which sets are infected with malware. Of these three, the report finds the most appropriate method to be the inline scan of backup and recovery data.
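A minimal sketch of the inline-scan idea (the scanner here is a stand-in; real products embed commercial malware engines): every chunk of the backup stream passes through the scanner before it is allowed into the repository, so infected data never lands in the golden copy.

```python
CHUNK_SIZE = 1024 * 1024  # scan the stream in 1 MiB chunks

KNOWN_SIGNATURES: set = set()  # would be fed by a real malware engine

def scan_for_malware(chunk: bytes) -> bool:
    """Placeholder check against known byte signatures."""
    return any(sig in chunk for sig in KNOWN_SIGNATURES)

def inline_backup(source, repository) -> None:
    """Write the backup stream only after each chunk passes the scan."""
    while True:
        chunk = source.read(CHUNK_SIZE)
        if not chunk:
            break
        if scan_for_malware(chunk):
            raise RuntimeError("malware detected; quarantine this backup set")
        repository.write(chunk)
```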


As DCIG states, “Inline scans represent the easiest and fastest way for a company to scan its backup data for the presence of known strains of malware as well as position the company to scan recovered data for yet unknown malware signatures.” That is where the Asigra enterprise backup solution comes into the picture as a top contender. The report suggests Asigra Cloud Backup V14 as an optimum solution for inline scanning of malware.

Jerome Wendt, Founder and President, DCIG, says, “The products that Asigra, Dell EMC, and Rubrik offer, and the respective techniques they use to detect the presence of malware in backup repositories, represent the primary methodologies that backup software employs. Of these three, only Asigra and Rubrik provide a company with the means to automate and simplify the process to detect malware in backups. Of those two, only Asigra currently makes cybersecurity software available as an optional feature that a company can turn on.”


Eran Farajun, Executive Vice President, Asigra, says, “Asigra Cloud Backup V14 converges enterprise data protection and cybersecurity, embedding malware engines in the backup and recovery streams to prevent ransomware from impacting the business. Asigra identifies any infecting malware strains, quarantines them, then notifies the customer. It is a very comprehensive data protection solution, built from the ground up for distributed IT environments.”

You can download the free DCIG report here: http://library.asigra.com/dcig-report


July 10, 2019  5:17 PM

@SwiftStack Enables @dcBLOXinc To Deliver Multi-Region Cloud Storage

Jaideep Khanduja
Cloud storage, SwiftStack

DC BLOX is a multi-tenant data center provider in the Southeastern U.S. The company designs and manages highly secure and reliable data centers for almost all segments of clients, viz. government and education, enterprise, healthcare, content providers, life sciences, and managed service providers. That is a huge spectrum to serve. You will find many of their state-of-the-art data centers in traditionally underserved markets. This way they are able to provide affordable business-class cloud storage and colocation services, along with a private high-performance network in the Southeast, with an aim to guarantee business continuity. The company also supports hybrid IT environments with the least upfront capital investment and without compromising an iota of quality. Recently, DC BLOX selected SwiftStack to deliver a large-scale, multi-region cloud storage service for the Southeastern U.S. That is a huge volume to cater to with a seamless service.

SwiftStack is a market leader in multi-cloud storage and its management. DC BLOX decided to deploy SwiftStack software to boost its own multi-region, hyperscale cloud storage service for its large customer base and their business continuity (BC) and disaster recovery (DR) needs. As a business grows, customer expectations rise, and to meet those expectations the service provider needs a foolproof, stable system in place. DC BLOX was facing tremendous demand for affordable secondary storage services from its existing customers. At the same time, it was under high pressure to scale up its business and expand to more regions. That resulted in an immediate requirement for a new storage platform that could seamlessly perform, scale up, and cater to multiple geographic regions. There was also an intense requirement to support traditional and cloud-native applications.

Seamless Multi-Region Cloud Storage

Among many object and file storage options from leading vendors, DC BLOX found SwiftStack the most suitable after a comprehensive evaluation process. Thus SwiftStack was given the nod to provide a turnkey platform for DC BLOX Cloud Storage. With the help of SwiftStack, DC BLOX has been able to create a multi-region cluster that currently spans three locations, with a fourth coming shortly and 15 more to come within a stipulated timeframe in a well-planned manner.

Chris Gatch, CTO at DC BLOX, says, “SwiftStack helped us reduce the cost of storing and utilizing data, based on a comparison with other choices we considered, including the ability to manage more data with a smaller headcount. Along with savings at scale, we are able to offer innovative data services for a more compelling, more competitive solution.”

Erik Pounds, Vice President of Marketing at SwiftStack, says, “DC BLOX offers a cloud solution that addresses the needs of the business communities they serve, and also has unique differentiators to let them compete with global public cloud providers. Giving its customers both object and file access to data ensures cloud storage is compatible with their users’ modern and legacy applications, which is a fairly unique feature compared to what is available from big cloud vendors.”


July 8, 2019  7:27 PM

Content Guru Creates A New Landmark at CCW Las Vegas @cgchirp

Jaideep Khanduja
Customer engagement, customer experience, Las Vegas, Rakuten

Customer Contact Week (CCW) in Las Vegas this week became a major collaboration moment for two experts in their respective fields. One is Content Guru, a global frontrunner in large-volume, cloud-based contact center technology. The other is Rakuten Inc., the Japanese electronic commerce company. The two global leaders joined forces at CCW to demonstrate how effective intelligent automation is when it comes to delivering exemplary customer engagement and experience. Of course, excelling in customer engagement is a matter of concern for any organization across the globe. As a matter of fact, this year marks the 20th anniversary of CCW. CCW, or Customer Contact Week, held from June 24 to June 28, 2019 at The Mirage Hotel in Las Vegas, is the world’s most significant conference and expo for Customer Experience (CX), contact center, and customer care professionals.


Content Guru was at stand #1102 at the event. Its theme for the event was to showcase major achievements for its customers in terms of customer experience. Rakuten is also known as the ‘Amazon of Japan’. The story of the relationship between Content Guru and Rakuten is quite interesting: their journey together began at CCW Las Vegas in 2017. Rakuten highlighted how it successfully used Content Guru’s cloud-based storm® platform to transform its customers’ experience. After this deployment, Rakuten’s customers enjoy a better experience when they contact the company or the thousands of sellers that use the Rakuten platform to sell their products. In fact, Content Guru hosted a workshop on Tuesday, June 25, titled “Next Generation Omni-Channel Contact Center: AI, NLP, Web Chat & Chatbots”. The workshop was led by Martin Taylor, Deputy CEO, Content Guru.

Content Guru Creates A New Landmark in Customer Experience

By conducting this workshop, Content Guru showcased how it uses intelligent automation and Artificial Intelligence through its state-of-the-art storm® platform to help customers deliver high-quality, ultimate customer service. Martin Taylor says, “Content Guru’s partnership with Rakuten originally began from conversations at CCW Las Vegas, so this event will always have a special place in our hearts. The quality of customer service has become a crucial differentiator between businesses. The CCW conference allows us to put front-and-center Content Guru’s vision to place organizations head-and-shoulders above their competition by providing the best Customer Engagement and Experience.”



July 2, 2019  4:01 PM

Why You Should Replace or Enhance Your Legacy VPN with a Software-Defined Perimeter (SDP) Solution

Jaideep Khanduja
DH2i, SDP, VPN

As has been the case for many decades, innovative new applications are entering the marketplace on a regular basis to support the ever-changing way we do business, interact with customers, and interact with each other. While developments in areas such as cloud, AI, machine learning, IoT, edge, mobile, and big data, to name just a few, bring with them undeniable and highly desirable benefits, they can also introduce problems. Certainly, one of the biggest pain points for many organizations is how to ensure the protection and security of data. To not do so can mean not only serious detriment to the long-term success of your business but can also carry serious legal and regulatory compliance ramifications.

Adding to the problem is that many IT professionals have come to rely upon and trust virtual private networks (VPNs) to deliver the level of security they require. And this makes sense. For a very long time, they did indeed deliver the required security protection. Unfortunately, as it stands today, VPNs have not evolved to support today’s application protection and security requirements. At least, not by themselves.

I recently spoke with Don Boxley, CEO, and Co-Founder of DH2i on this subject. He describes VPNs as taking a “castle and moat” approach to security, where the VPN serves as the drawbridge. This painted a very understandable picture as to why VPNs are unable to meet today’s new business and IT realities. He explained that via this approach, organizations are more vulnerable to compromised devices and networks, excessive network access by non-privileged users, credential theft and other security issues. From a non-security specific standpoint, the VPN introduces complex manual set-up and maintenance, slow and unreliable connections, and an inability to scale efficiently and cost-effectively.

We then talked about a relatively new approach that doesn’t necessarily replace a VPN (although, I would argue it could) but dramatically enhances it – a software-defined perimeter (SDP) solution. SDPs offer an ideal new approach to connectivity security. SDP tackles legacy VPN, cloud-native, and privileged user access security issues. Designed specifically to support today’s DevOps, IoT, container, edge, and other workloads, with the inherent flexibility to be tailored to future, yet-to-be-introduced application/workload requirements, SDP delivers not only considerably improved security but increased performance as well.
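To make the contrast concrete, here is a deliberately simplified, hypothetical sketch (not DH2i’s implementation): a VPN authenticates once at the perimeter and then exposes the network, while an SDP-style gateway authorizes each session against the user, the device, and the one application being requested.

```python
# Hypothetical contrast between perimeter (VPN-style) and per-connection
# (SDP-style) access; all names and data are illustrative only.

USERS = {"alice"}                          # authenticated identities
HEALTHY_DEVICES = {"laptop-42"}            # devices passing posture checks
ENTITLEMENTS = {("alice", "payroll-app")}  # user -> application grants
NETWORK_HOSTS = ["payroll-app", "db-01", "ci-server"]

def vpn_connect(user: str) -> list:
    """One check at the 'drawbridge', then broad network visibility."""
    if user not in USERS:
        raise PermissionError("unknown user")
    return NETWORK_HOSTS  # every host on the network is now reachable

def sdp_connect(user: str, device: str, app: str) -> str:
    """Zero trust: verify user, device posture, and app entitlement."""
    if user not in USERS:
        raise PermissionError("unknown user")
    if device not in HEALTHY_DEVICES:
        raise PermissionError("untrusted device")
    if (user, app) not in ENTITLEMENTS:
        raise PermissionError("no entitlement for this application")
    return f"tunnel:{user}->{app}"  # access to this one application only

print(vpn_connect("alice"))                              # whole network
print(sdp_connect("alice", "laptop-42", "payroll-app"))  # one app
```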

DH2i has announced a new SDP solution called DxConnect. DxConnect is network security software designed to enable developers and network admins to build an integrated zero trust (ZT) connectivity security infrastructure for cloud-native applications, hybrid/multi-cloud connectivity, and privileged user access without using a VPN. If you are interested in securing your organization’s data and wish to replace or enhance the capabilities of your VPN, you can learn more about DH2i’s new software here: http://dh2i.com/dxconnect/.


June 30, 2019  2:35 PM

Data center virtualization A Complete Guide – 2019 Edition @Amazon

Jaideep Khanduja
Data Center Virtualization

Data center virtualization A Complete Guide – 2019 Edition by Gerardus Blokdyk

Link: https://www.amazon.com/Data-center-virtualization-Complete-Guide-ebook/dp/B07T9QJSP7/ref=sr_1_22?keywords=virtualization&qid=1561884991&s=books&sr=1-22

Excerpt from Amazon.com

What is your risk of UPS failure? Should SAN and tape storage solutions be included in the asset inventory baseline? How does your organization further reduce operating costs at your data center? How do you remotely control, monitor, and maintain the system? How much cooling infrastructure is necessary to cool the server environment adequately?

Defining, designing, creating, and implementing a process to solve a challenge or meet an objective is the most valuable role… In EVERY group, company, organization and department.

Unless you are talking a one-time, single-use project, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a complex enough perspective to ask the right questions. Someone capable of asking the right questions and step back and say, ‘What are we really trying to accomplish here? And is there a different way to look at it?’

This Self-Assessment empowers people to do just that – whether their title is entrepreneur, manager, consultant, (Vice-)President, CxO etc… – they are the people who rule the future. They are the person who asks the right questions to make Data center virtualization investments work better.

This Data center virtualization All-Inclusive Self-Assessment enables You to be that person.

All the tools you need to an in-depth Data center virtualization Self-Assessment. Featuring 933 new and updated case-based questions, organized into seven core areas of process design, this Self-Assessment will help you identify areas in which Data center virtualization improvements can be made.

In using the questions you will be better able to:

– diagnose Data center virtualization projects, initiatives, organizations, businesses and processes using accepted diagnostic standards and practices

– implement evidence-based best practice strategies aligned with overall goals

– integrate recent advances in Data center virtualization and process design strategies into practice according to best practice guidelines

Using a Self-Assessment tool known as the Data center virtualization Scorecard, you will develop a clear picture of which Data center virtualization areas need attention.


June 30, 2019  2:30 PM

Server Virtualization A Complete Guide – 2019 Edition @Amazon

Jaideep Khanduja
Virtualization

Server Virtualization A Complete Guide – 2019 Edition by Gerardus Blokdyk

Link: https://www.amazon.com/Server-Virtualization-Complete-Guide-2019/dp/0655805613/ref=sr_1_18?keywords=virtualization&qid=1561884991&s=books&sr=1-18

Excerpt from Amazon.com:

What proportion of your data centers pursue energy savings through server virtualization? What proportion of data centers pursue energy savings through server virtualization? What about incompatibilities between two applications installed on the same instance of an operating system? What about mid-size companies? How will server virtualization technology evolve?

Defining, designing, creating, and implementing a process to solve a challenge or meet an objective is the most valuable role… In EVERY group, company, organization and department.

Unless you are talking a one-time, single-use project, there should be a process. Whether that process is managed and implemented by humans, AI, or a combination of the two, it needs to be designed by someone with a complex enough perspective to ask the right questions. Someone capable of asking the right questions and step back and say, ‘What are we really trying to accomplish here? And is there a different way to look at it?’

This Self-Assessment empowers people to do just that – whether their title is entrepreneur, manager, consultant, (Vice-)President, CxO etc… – they are the people who rule the future. They are the person who asks the right questions to make Server Virtualization investments work better.

This Server Virtualization All-Inclusive Self-Assessment enables You to be that person.

All the tools you need to an in-depth Server Virtualization Self-Assessment. Featuring 955 new and updated case-based questions, organized into seven core areas of process design, this Self-Assessment will help you identify areas in which Server Virtualization improvements can be made.

In using the questions you will be better able to:

– diagnose Server Virtualization projects, initiatives, organizations, businesses and processes using accepted diagnostic standards and practices

– implement evidence-based best practice strategies aligned with overall goals

– integrate recent advances in Server Virtualization and process design strategies into practice according to best practice guidelines

Using a Self-Assessment tool known as the Server Virtualization Scorecard, you will develop a clear picture of which Server Virtualization areas need attention.

Your purchase includes access details to the Server Virtualization self-assessment dashboard download which gives you your dynamically prioritized projects-ready tool and shows your organization exactly what to do next. You will receive the following contents with New and Updated specific criteria:

– The latest quick edition of the book in PDF

– The latest complete edition of the book in PDF, which criteria correspond to the criteria in…

– The Self-Assessment Excel Dashboard

– Example pre-filled Self-Assessment Excel Dashboard to get familiar with results generation

– In-depth and specific Server Virtualization Checklists

– Project management checklists and templates to assist with implementation

INCLUDES LIFETIME SELF ASSESSMENT UPDATES

Every self assessment comes with Lifetime Updates and Lifetime Free Updated Books. Lifetime Updates is an industry-first feature which allows you to receive verified self assessment updates, ensuring you always have the most accurate information at your fingertips.


June 30, 2019  2:26 PM

Global Software Engineering: Virtualization and Coordination @Amazon

Jaideep Khanduja
Software engineering, Virtualization

Global Software Engineering: Virtualization and Coordination (Applied Software Engineering Series) by Gamel O. Wiredu

Link: https://www.amazon.com/Global-Software-Engineering-Virtualization-Coordination-ebook/dp/B07TMG3LYM/ref=sr_1_16?keywords=virtualization&qid=1561883567&s=books&sr=1-16

Excerpt as on Amazon:

Technology and organizations co-evolve, as is illustrated by the growth of information and communication technology (ICT) and global software engineering (GSE). Technology has enabled the development of innovations in GSE. The literature on GSE has emphasized the role of the organization at the expense of technology. This book explores the role of technology in the evolution of globally distributed software engineering.

To date, the role of the organization has been examined in coordinating GSE activities because of the prevalence of the logic of rationality (i.e., the efficiency ethos, mechanical methods, and mathematical analysis) and indeterminacy (i.e., the effectiveness ethos, natural methods, and functional analysis). This logic neglects the coordination role of ICT. However, GSE itself is an organizational mode that is technology-begotten, technology-dominated, and technology-driven, as is its coordination. GSE is a direct reflection of ICT innovation, change, and use, yet research into the role of technology in GSE has been neglected.

Global Software Engineering: Virtualization and Coordination considers existing fragmented explanations and perspectives in GSE research, poses new questions about GSE, and proposes a framework based on the logic of virtuality (i.e., creativity ethos, electrical methods, and technological analysis) rather than of rationality and indeterminacy. Virtuality is the primary perspective in this book’s comprehensive study of GSE. The book concludes with an integrated explanation of GSE coordination made possible through ICT connectivity and capitalization.


June 30, 2019  2:23 PM

Docker in Action 2nd Edition @Amazon

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Docker

Docker in Action 2nd Edition by Jeff Nickoloff and Stephen Kuenzli

Link: https://www.amazon.com/Docker-Action-Jeff-Nickoloff/dp/1617294764/ref=sr_1_10?keywords=virtualization&qid=1561883567&s=books&sr=1-10

Excerpt as on Amazon:

Even small applications have dozens of components. Large applications may have thousands, which makes them challenging to install, maintain, and remove. Docker bundles all application components into a package called a container that keeps things tidy and helps manage any dependencies on other applications or infrastructure.

Docker in Action, Second Edition teaches you the skills and knowledge you need to create, deploy, and manage applications hosted in Docker containers. This bestseller has been fully updated with new examples, best practices, and entirely new chapters. You’ll start with a clear explanation of the Docker model and learn how to package applications in containers, including techniques for testing and distributing applications.
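To make the packaging idea concrete, here is a minimal sketch using Docker’s official Python SDK (the docker package); the image and command are illustrative, and a running local Docker daemon is assumed.

# Minimal sketch using the Docker SDK for Python (pip install docker).
# Assumes a local Docker daemon; the image and command are illustrative.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a one-off command inside a small container, then clean it up.
output = client.containers.run("alpine:3.19", ["echo", "hello from a container"], remove=True)
print(output.decode().strip())

# List the images now cached locally, much like `docker images`.
for image in client.images.list():
    print(image.tags)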

Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.


June 30, 2019  2:20 PM

5G Physical Layer Technologies (Wiley – IEEE) @Amazon

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
5G

5G Physical Layer Technologies (Wiley – IEEE) by Mosa Ali Abu-Rgheff

Link: https://www.amazon.com/5G-Physical-Layer-Technologies-Wiley/dp/1119525519/ref=sr_1_8?keywords=virtualization&qid=1561883567&s=books&sr=1-8

Excerpt as on Amazon:

Written in a clear and concise manner, this book presents readers with an in-depth discussion of the 5G technologies that will help move society beyond its current capabilities. It perfectly illustrates how the technology itself will benefit both individual consumers and industry as the world heads towards a more connected state of being. Every technological application presented is modeled in a schematic diagram and is considered in depth through mathematical analysis and performance assessment. Furthermore, published simulation data and measurements are checked.

Each chapter of 5G Physical Layer Technologies contains texts, mathematical analysis, and applications supported by figures, graphs, data tables, appendices, and a list of up to date references, along with an executive summary of the key issues. Topics covered include: the evolution of wireless communications; full duplex communications and full dimension MIMO technologies; network virtualization and wireless energy harvesting; Internet of Things and smart cities; and millimeter wave massive MIMO technology. Additional chapters look at millimeter wave propagation losses caused by atmospheric gases, rain, snow, building materials and vegetation; wireless channel modeling and array mutual coupling; massive array configurations and 3D channel modeling; massive MIMO channel estimation schemes and channel reciprocity; 3D beamforming technologies; and linear precoding strategies for multiuser massive MIMO systems. Other features include:

In depth coverage of a hot topic soon to become the backbone of IoT connecting devices, machines, and vehicles
Addresses the need for green communications for the 21st century
Provides a comprehensive support for the advanced mathematics exploited in the book by including appendices and worked examples
Contributions from the EU research programmes, the International telecommunications companies, and the International standards institutions (ITU; 3GPP; ETSI) are covered in depth
Includes numerous tables and illustrations to aid the reader
Fills the gap in the current literature where technologies are not explained in depth or omitted altogether

5G Physical Layer Technologies is an essential resource for undergraduate and postgraduate courses on wireless communications and technology. It is also an excellent source of information for design engineers, research and development engineers, the private-public research community, university research academics, undergraduate and postgraduate students, technical managers, service providers, and all professionals involved in the communications and technology industry.


June 30, 2019  2:17 PM

Microsoft Azure Infrastructure Services for Architects @Amazon

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Azure

Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions by John Savill

Link: https://www.amazon.com/Microsoft-Azure-Infrastructure-Services-Architects/dp/1119596572/ref=sr_1_6?keywords=virtualization&qid=1561883567&s=books&sr=1-6

Excerpt as on Amazon.com:

An expert guide for IT administrators needing to create and manage a public cloud and virtual network using Microsoft Azure

With Microsoft Azure challenging Amazon Web Services (AWS) for market share, there has been no better time for IT professionals to broaden and expand their knowledge of Microsoft’s flagship virtualization and cloud computing service. Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions helps readers develop the skills required to understand the capabilities of Microsoft Azure for Infrastructure Services and implement a public cloud to achieve full virtualization of data, both on and off premise. Microsoft Azure provides granular control in choosing core infrastructure components, enabling IT administrators to deploy new Windows Server and Linux virtual machines, adjust usage as requirements change, and scale to meet the infrastructure needs of their entire organization.

This accurate, authoritative book covers topics including IaaS cost and options, customizing VM storage, enabling external connectivity to Azure virtual machines, extending Azure Active Directory, replicating and backing up to Azure, disaster recovery, and much more. New users and experienced professionals alike will:

Get expert guidance on understanding, evaluating, deploying, and maintaining Microsoft Azure environments from Microsoft MVP and technical specialist John Savill
Develop the skills to set up cloud-based virtual machines, deploy web servers, configure hosted data stores, and use other key Azure technologies
Understand how to design and implement serverless and hybrid solutions
Learn to use enterprise security guidelines for Azure deployment

Offering the most up to date information and practical advice, Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions is an essential resource for IT administrators, consultants and engineers responsible for learning, designing, implementing, managing, and maintaining Microsoft virtualization and cloud technologies.
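As a flavor of what managing Azure infrastructure programmatically looks like, here is a small, hedged sketch using the Azure SDK for Python; the subscription ID, resource group name, and region are placeholders, and it only scratches the surface of what the book covers.

# Hedged sketch using the Azure SDK for Python
# (pip install azure-identity azure-mgmt-resource).
# Subscription ID, resource group name, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # uses CLI, environment, or managed identity auth
client = ResourceManagementClient(credential, "00000000-0000-0000-0000-000000000000")

# Create (or update) a resource group to hold VMs, networks, and storage.
rg = client.resource_groups.create_or_update(
    "demo-infrastructure-rg", {"location": "eastus"}
)
print(rg.name, rg.location)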

The book is yet to be released; you can pre-order it.


June 30, 2019  2:13 PM

SQL Server Big Data Clusters Revealed: A Book @Amazon to Pre-Order

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Big Data

Title: SQL Server Big Data Clusters Revealed: The Data Virtualization, Data Lake, and AI Platform by Benjamin Weissman and Enrico van de Laar

Link: https://www.amazon.com/Server-Data-Clusters-Revealed-Virtualization/dp/1484251091/ref=sr_1_3?keywords=virtualization&qid=1561883567&s=books&sr=1-3

Editorial Review: As on Amazon.com
From the Back Cover

Use this guide to one of SQL Server 2019’s latest and most impactful features―Big Data Clusters―that combines large volumes of non-relational data for analysis along with data stored relationally inside a SQL Server database.

Big Data Clusters is a feature set covering data virtualization, distributed computing, and relational databases and provides a complete AI platform across the entire cluster environment. This book shows you how to deploy, manage, and use Big Data Clusters. For example, you will learn how to combine data stored on the HDFS file system together with data stored inside the SQL Server instances that make up the Big Data Cluster.

Filled with clear examples and use cases, SQL Server Big Data Clusters Revealed provides everything necessary to get started working with SQL Server 2019 Big Data Clusters. You will learn about the architectural foundations that are made up of Kubernetes, Spark, HDFS, and SQL Server on Linux. You are then shown how to configure and deploy Big Data Clusters in on-premises environments or in the cloud. Next, you are taught about querying. You will learn to write queries in Transact-SQL―taking advantage of skills you have honed for years―and with those queries you will be able to examine and analyze data from a wide variety of sources such as Apache Spark.

Through the theoretical foundation provided in this book and easy-to-follow example scripts and notebooks, you will be ready to use and unveil the full potential of SQL Server 2019: combining different types of data spread across widely disparate sources into a single view that is useful for business intelligence and machine learning analysis.

You will:

Install, manage, and troubleshoot Big Data Clusters in cloud or on-premise environments
Analyze large volumes of data directly from SQL Server and/or Apache Spark
Manage data stored in HDFS from SQL Server as if it were relational data
Implement advanced analytics solutions through machine learning and AI
Expose different data sources as a single logical source using data virtualization
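As a concrete illustration of that querying point, here is a small, hypothetical sketch of running plain Transact-SQL against an HDFS-backed external table from Python via pyodbc; the server address, credentials, and the table name dbo.hdfs_sales are invented, and the external table is assumed to have been defined through the cluster’s data virtualization layer.

# Hypothetical sketch: querying a SQL Server 2019 Big Data Cluster from Python.
# Server, credentials, and the external table `dbo.hdfs_sales` (assumed to be
# defined over HDFS via the data virtualization layer) are all illustrative.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=bdc-master.example.com,31433;"
    "DATABASE=sales;UID=app_user;PWD=example-password"
)
cursor = conn.cursor()

# Plain T-SQL; the engine reads the HDFS-backed table transparently.
cursor.execute(
    "SELECT TOP 10 region, SUM(amount) AS total "
    "FROM dbo.hdfs_sales GROUP BY region ORDER BY total DESC"
)
for region, total in cursor.fetchall():
    print(region, total)
conn.close()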


June 30, 2019  1:57 PM

Virtual Appliance Performance Achieves A New Landmark @StorOne_Inc

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Seagate, SSD, Virtual Appliance

A video link at the end of this article will help you understand things better; do have a look after you finish reading. The video is about StorONE Virtual Appliance performance, which I am covering here along with StorONE’s recent partnership with Seagate that aims to maximize SSD performance specs. Basically, a unified virtual appliance with Seagate drives is ideal for I/O-intensive environments. On 25th June, StorONE announced a unique achievement of record speeds using Seagate SSDs and StorONE’s TRU™ S1 Software-Defined Storage solution. In this recent performance testing, StorONE combined Seagate’s enterprise-class SSDs with its software in a virtual appliance configuration. The combination attained a breakthrough half a million IOPS with 24 Seagate-supplied SSDs, with all enterprise-class data protection features running.
For any such environment, what matters most is seamless throughput. In this particular case, the high-availability, failure-proof VMware cluster attained roughly half a million IOPS on 4K random reads and 180,000 IOPS on 4K random writes, with latency under 0.2 milliseconds in both cases. This is a significant achievement, as it eliminates the storage performance issues that arise from server virtualization, and it makes the StorONE Unified Virtual Appliance configuration with Seagate drives a unique and compelling proposition. The StorONE software runs in an ESXi VM that doesn’t require any complicated setup or configuration. It’s a simple environment that uses very little memory and compute; it doesn’t require high-end server configurations, huge memory caches, or other expensive hardware components. The virtual appliance supports all major hypervisor environments, including VMware, Oracle VM, Hyper-V, and KVM.
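To put those figures in perspective, here is a quick back-of-the-envelope conversion of IOPS at a 4 KiB block size into raw throughput; the read figure assumes the half-million IOPS reported above.

# Back-of-the-envelope throughput from the reported IOPS figures,
# assuming 4 KiB (4,096-byte) blocks as in the 4K random tests.
BLOCK_BYTES = 4096

read_iops = 500_000   # roughly half a million IOPS on 4K random reads
write_iops = 180_000  # reported IOPS on 4K random writes

print(f"read throughput  ~ {read_iops * BLOCK_BYTES / 1e9:.2f} GB/s")   # ~2.05 GB/s
print(f"write throughput ~ {write_iops * BLOCK_BYTES / 1e9:.2f} GB/s")  # ~0.74 GB/s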

Virtual Appliance Performance

Ravi Naik, CIO and SVP Corporate Strategy at Seagate, says, “StorONE’s software elicits the extreme performance of our SSDs in meeting the needs of mission-critical applications. With StorONE’s software, users can achieve the maximum utilization of resources from our solid-state drives, getting the best possible TCO via optimized storage capacity and performance.”

Gal Naor, StorONE co-founder and CEO, says, “Seagate drives offer superior drive design and reliability for enterprise use, and our tests show they offer performance beyond their competitors. We are very proud to have the world’s leading data solutions company as an early investor and partner, on whose drive technology we can reach incredible throughput and deliver all essential data services for managing and protecting data.”

For a performance demo of the Unified Virtual Appliance with Seagate SSDs visit https://youtu.be/KMDzOzEr79o


June 16, 2019  11:24 PM

SwiftStack Data Analytics Solution With Alluxio @SwiftStack @alluxio

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Data Analytics, Data storage, SwiftStack

Do you know what the common factor is among Tata Communications, eBay, Cisco, Technical Assistance Center, Kaidee, Counsyl, Surf Sara, DC Blox, Hepsiburada, DRN, OMRF, Bet365, ESU10, Tieto, TCT, Premiere Digital, Enter, NSS Labs, PayPal, yp.ca, Douglas, Verizon, Pac12, and Burton? All of them are SwiftStack customers. Happy and delighted customers, for years. We are talking about one of the best data analytics solutions across the globe.

Data Analytics Solution

Source: SwiftStack

Now, let us look at another set of global-class companies: Wells Fargo, Lenovo, PayPal, DBS, Walmart, Huatai Securities, Myntra, VIPShop, Two Sigma, Oracle, Comcast, JD.com, Samsung, NetEase Games, DiDi, Tencent, Esri, Ctrip, Nielsen, Caesars, Barclays, Baidu, Swisscom, and China Mobile. All of these are Alluxio customers. So when SwiftStack and Alluxio come out together with a solution, it has to be a state-of-the-art, unique one. And what are the CTOs and CIOs of companies that are not customers of SwiftStack or Alluxio supposed to do? They must immediately understand what it is all about, and what a world-class data analytics solution is all about.

Data Analytics Solution

Source: SwiftStack

Following that, three steps are very important for them: participate in a live demo, have a technical deep-dive discussion with SwiftStack and Alluxio, and download the product and try it. Without tasting the pudding, you will never understand its power and beauty. This Alluxio and SwiftStack partnership brings the SwiftStack Data Analytics Solution with Alluxio: a seamless edge-to-core-to-cloud solution. The current enterprise situation across the globe is quite alarming. Most businesses are in a fix, caught in a dilemma between on-premises and cloud.

SwiftStack Data Analytics Solution with Alluxio

The data is lying in silos and in huge volumes. Existing solutions fall short on promises and commitments. Most of the products in the market lack enterprise readiness. On top of that, cloud OpEx is increasing at a fast pace. The pressures are mounting. Are you safe with your current design and solutions?

Data Analytics Solution

Source: SwiftStack

None of the existing data analytics vendors seems to have an iota of confidence in their products when it comes to catering to four rapidly changing trends: the separation of compute and storage, hybrid multi-cloud environments, the rise of the object store, and self-service data across the enterprise. The data ecosystem will not be the same next year. Look at the data ecosystem’s beta version, where compute depended solely on Hadoop MapReduce and storage on Hadoop HDFS.

Data Analytics Solution

Source: SwiftStack

The next maturity model was Data Ecosystem 1.0, where compute had many players like Presto, Spark, Flink, Caffe, Apache HBase, and TensorFlow, while Hadoop MapReduce stayed a strong contender as before. On the storage front, a lot of players emerged, such as Amazon S3, Azure, Ceph, HPE, IBM, Dell EMC ECS, Hitachi, and MinIO, in addition to the legendary Hadoop HDFS. Things are changing rapidly.
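To illustrate the separation of compute and storage that this evolution enables, here is a small, illustrative PySpark sketch that reads analytics data directly from an S3-compatible object store; the endpoint, credentials, and paths are invented, not taken from any SwiftStack documentation.

# Illustrative PySpark sketch: compute reads directly from an S3-compatible
# object store. Endpoint, credentials, and data paths are made up.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("object-store-analytics")
    # Point the s3a connector at an S3-compatible endpoint.
    .config("spark.hadoop.fs.s3a.endpoint", "https://objectstore.example.com")
    .config("spark.hadoop.fs.s3a.access.key", "EXAMPLE_KEY")
    .config("spark.hadoop.fs.s3a.secret.key", "EXAMPLE_SECRET")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

# Storage and compute stay separate: Spark pulls only the data it needs.
events = spark.read.parquet("s3a://analytics/events/")
events.groupBy("event_type").count().show()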

Data Analytics Solution

Source: SwiftStack

The SwiftStack Data Analytics Solution with Alluxio’s accelerated compute, data accessibility, and elasticity ensures multi-cloud storage and data management. Name a business use case and there is a seamless solution available. Amita Potnis, Research Director in IDC’s Infrastructure System Platform and Technologies Group, says, “Infrastructure challenges are the primary inhibitor for broader adoption of AI/ML workflows. SwiftStack’s Multi-Cloud data management solution is first of its kind in the industry and effectively handles storage I/O challenges faced by the edge to core to cloud, large scale AI/ML data pipelines.”


June 15, 2019  9:45 PM

Do You Get A Comprehensive View of IT Assets and Activities @Cynet360

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Cyber security, cyber-attacks, cyber-crimes, IT asset management

I am sure keeping track of IT assets in the organization is a big pain for all IT heads. There are assets in use, assets kept as standbys, and assets that are old, obsolete, or out of use, lying idle in some storeroom of the organization. Losing a 1TB laptop with no data on it is not a major issue. But losing even a few MB of crucial, confidential data can create a big issue that might lead to a loss of reputation and finances. All this makes a comprehensive view of IT assets, their activities, and their movements all the more important. There is nothing like it if the solution comes from one of the top leaders in the industry, and that too free of cost. Cynet recently launched a free proactive visibility solution.

IT Assets

This Proactive Visibility offering from Cynet provides a comprehensive, real-time view of your organization’s IT assets, with all checks and alerts built in. The tool is equally beneficial for service providers and enterprises. It covers whatever you can think of, with detailed inventory reporting and attack surface elimination. It is an on-demand solution that empowers an organization to enhance security and productivity tasks on an ongoing basis. In fact, IT and security decision makers find it an effective catalyst for the tools they use, providing comprehensive visibility to boost critical IT operations and the productivity of both end users and service providers. The rate, intensity, and severity of cybercrime across the globe are increasing at a tremendous pace. Over the last four years, the cost of cybercrime has quadrupled, which is alarming.

IT Assets

A recent research report from Juniper titled The Future of Cybercrime & Security: Financial and Corporate Threats & Mitigations estimates that the total cost of cybercrime will exceed $2 trillion this year. The larger an organization’s attack surface, the more points or attack vectors it exposes, and hence the greater the scale of threats and vulnerabilities arising from it. A professional hacker or cybercriminal needs just one of these points to penetrate your organization’s databases. Recent research shows that around 60 percent of organizations across the globe have over 100,000 folders open to every employee. Cynet’s Proactive Visibility empowers security administrators to enhance the efficiency of security monitoring workflows.

Eyal Gruner, Founder & President of Cynet says,

“There is a critical need for a single-source-of-truth where users get a complete visual of both positive and negative actions/processing taking place across their centralized or distributed IT infrastructure. Our free Proactive Visibility Experience delivers the operational reality of having all of this data available with the click of a button, allowing for accurate data-driven decision making.”

IT Assets

Key components of the Cynet 360 Security Platform include Cynet Proactive Visibility, Attack Protection, and Response Orchestration. It integrates various globally accepted technologies like NGAV, network analytics, EDR, UBA, and deception. As a matter of fact, Cynet is the first vendor to integrate all the essential breach protection capabilities. These consolidated capabilities are then applied to the complete internal environment in a single interface.

“It’s a rather worn-out phrase that you can’t secure what you don’t know, but it’s true all the same,” Gruner adds. “We’re really able to boost organizations in the right direction with our highly available, high-resolution knowledge of the user’s environment. Use of the Proactive Visibility offering is the equivalent of a good opening move in chess, because it narrows down the risks that users face and enables the enterprise to focus on what really matters.”

Whether you are an end-user organization or a service provider, to gain 14-day access to the Cynet 360 platform, which includes full visibility into your IT environment, IT assets, host configurations, user account activities, installed software, network traffic, and password hygiene, click on the link: free access to its end-to-end visibility capabilities.


June 10, 2019  8:53 PM

Deskless Workforce Management Becomes Easier with StaffConnect

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
mobile workforce, Remote worker

The biggest challenge for enterprises across the globe is to close the gap between important organizational information and their mobile employees, or deskless workforce. This challenge became a thing of the past for the organizations that adopted StaffConnect well in time; those who have already tasted the pudding are reaping the rewards. For those who haven’t deployed StaffConnect so far, it is a golden chance to go for the StaffConnect next-generation mobile employee engagement platform. Why is it necessary? First and foremost, organizations with remote workers are struggling to tackle an employee engagement crisis. Second, StaffConnect is inspiring emotionally connected enterprises. Otherwise, it becomes painful to bridge the emotional disconnect between employees and the organization. The launch of StaffConnect v2.4 brings enhanced enterprise features with powerful analytics.

Besides enhanced enterprise features and powerful analytics, StaffConnect v2.4 also comes with advanced functionality for segmenting users and delivering targeted content and messages to an organization’s mobile workforce, irrespective of their location. StaffConnect is a pioneer of mobile employee engagement solutions for the deskless workforce. StaffConnect v2.4 has become more powerful with features such as 5,000 unique groups assignable to 100 unique categories. Administrators can now easily and efficiently create targeted content that is not only relevant and meaningful but can also be shared quickly and directly with targeted audiences across an organization. Those targeted audiences can be defined by employee, department, role, building, and so on. What this means is that with StaffConnect v2.4, each feed can be made as complex or as simple as the organization requires, and each feed is easily customizable and configurable by administrators without any assistance, as the sketch below illustrates.
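To make the segmentation idea concrete, here is a minimal, purely illustrative sketch of targeting a feed item by group attributes; the data model and helper function are invented for illustration and do not reflect StaffConnect’s actual API.

# Purely illustrative sketch of audience segmentation for targeted feeds;
# this is not StaffConnect's API, just the underlying idea of groups.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    department: str
    location: str

STAFF = [
    Employee("Asha", "Nursing", "Site A"),
    Employee("Ben", "Facilities", "Site B"),
    Employee("Carol", "Nursing", "Site B"),
]

def audience(department=None, location=None):
    """Select employees matching the given group attributes."""
    return [
        e for e in STAFF
        if (department is None or e.department == department)
        and (location is None or e.location == location)
    ]

# Deliver a feed item only to nursing staff at Site B.
for employee in audience(department="Nursing", location="Site B"):
    print(f"push update to {employee.name}")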

Deskless workforce

“How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees” is a very interesting eBook released by StaffConnect. It includes the astonishing fact that 80 percent of the global workforce is deployed offsite or remotely. The eBook provides ample statistics and insights from various angles, highlighting the pain areas and the financial losses organizations incur when their deskless workforce is not well connected with the organization. Betsi Cadwaladr University Health Board (BCUHB) is one of StaffConnect’s key customers.

Aaron Haley, Communications Officer at BCUHB says, “While email works for our desk-based staff, there’s a big contingency of our workforce who just can’t find the time to get on to a computer as part of their working day. We wanted the new internal communications tool to be completely voluntary, and we wanted to demonstrate our commitment to improving internal communications with a platform that meets the needs of all our employees, regardless of their role or location. StaffConnect ticked all of those boxes.”

Deskless Workforce

Geraldine Osman, CMO, StaffConnect says, “StaffConnect v2.4 offers a powerful combination of advanced analytics and sophisticated audience segmentation capabilities in order to enable our large global enterprise customers to create, tailor and deliver highly targeted content to specific audiences. StaffConnect v2.4’s audience and content analytics then work across the platform and across the organization to track and measure every aspect of the communications program, enabling HR and Communications professionals to fine-tune strategy and ensure organizational goals are met.”

Osman continues, “With StaffConnect v2.4 HR and communications can now deliver an enhanced employee experience (EX) by delivering relevant and personalized information which creates deeper engagement. Highly engaged employees are more committed to company goals, productive and dedicated to ensuring optimal customer experiences (CX). Organizations that consistently deliver superior CX are proven to earn and enjoy increased revenues, profits and shareholder value.”


June 9, 2019  9:23 PM

How To Push Boundaries of Enterprise SSDs With @ViolinSystems

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Enterprise storage, Flash Array, NVM, SSD, Violin Systems

When you think of extreme-performance storage or enterprise SSDs, you think of Violin Systems. Otherwise, why would two pioneers in their respective fields decide to form a strategic development alliance for jointly advancing performance-based SSDs with the integration of Violin’s advanced technologies? The two masters entering into a strategic partnership to integrate Phison’s E12DC controller with Violin’s state-of-the-art technology and extreme-performance storage systems are Violin Systems and Phison Electronics Corporation. Violin Systems is a global leader in all-flash arrays. The company has been leading the market in innovating consistent extreme performance through its unique Flash Fabric Architecture and vRAID, while ensuring continuity of operations aided by its advanced enterprise data services. Violin is continuously evolving from custom to customized SSDs without compromising performance or resiliency. NVMe industry standards are compelled to catch up to the level of Violin’s proprietary technology.

Phison is among the top leaders in NAND flash controllers and applications, shipping more than 600 million controllers annually. That alone demonstrates its capabilities and its top position in enterprise-class technology. While Violin is one of Phison’s customers, it is also a joint technology partner, with both companies working collaboratively on performance optimization. By enhancing the standard NVMe interface with its proprietary performance-enhancing technologies, Violin’s performance storage solution, along with its advanced enterprise software, has reached an unmatched level. While promising consistent extreme performance for the industry’s top requirements, such as enterprise SSDs, OLTP, real-time analytics, virtualization applications, and SQL databases, it goes a level up in supporting AI, M2M learning, and IIoT applications. All this happens with zero tolerance for any performance lag.

Enterprise SSDs achieve a new paradigm

K.S. Pua, CEO, Phison says, “We aim to be our customers’ most dependable IT business partner. Phison is constantly striving to find new ways to integrate capabilities into our product sets that add true value for our customers. The Phison E12DC enterprise controller builds on the award-winning success of the E12 solution by adding robust algorithms to ensure predictable latency and consistent I/O. With the E12 already strongly established in the consumer space, we are able to accelerate our growth in the enterprise-embedded market. We believe that partnering with Violin and leveraging each other’s technology and expertise allows us to make a big impact on the market.”

Eric Burgener, research vice president, Infrastructure Systems, Platforms and Technologies Group, IDC says, “The tight collaboration of extreme-performance storage systems expert Violin with resources of an industry leader in flash controllers and NAND solutions like Phison is a great move that can enable rapid innovation working with standards-based interfaces.”

Mark Lewis, Chairman CEO of Violin Systems says, “While NVMe is becoming a growing standard in performance storage, we have been able to further enhance the advancements made with the technology by leveraging our patented technology to provide enterprises extreme, consistent performances. This joint development with Phison allows both companies to push the boundaries of enterprise SSDs and bring true innovation to the market through standards-based technology vs. proprietary technology.”


May 29, 2019  10:23 PM

5 Books on Agile Testing @Amazon #AgileTesting

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Agile Software, Agile testing

Book 1:

Agile Testing: A Practical Guide for Testers and Agile Teams by Lisa Crispin and Janet Gregory

Synopsis from Amazon:

Two of the industry’s most experienced agile testing practitioners and consultants, Lisa Crispin and Janet Gregory, have teamed up to bring you the definitive answers to these questions and many others. In Agile Testing, Crispin and Gregory define agile testing and illustrate the tester’s role with examples from real agile teams. They teach you how to use the agile testing quadrants to identify what testing is needed, who should do it, and what tools might help. The book chronicles an agile software development iteration from the viewpoint of a tester and explains the seven key success factors of agile testing.

Readers will come away from this book understanding

How to get testers engaged in agile development
Where testers and QA managers fit on an agile team
What to look for when hiring an agile tester
How to transition from a traditional cycle to agile development
How to complete testing activities in short iterations
How to use tests to successfully guide development
How to overcome barriers to test automation

This book is a must for agile testers, agile teams, their managers, and their customers.

The eBook edition of Agile Testing also is available as part of a two-eBook collection, The Agile Testing Collection (9780134190624).

Book 2: More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley Signature Series (Cohn)) by Janet Gregory and Lisa Crispin

Synopsis from Amazon:

Janet Gregory and Lisa Crispin pioneered the agile testing discipline with their previous work, Agile Testing. Now, in More Agile Testing, they reflect on all they’ve learned since. They address crucial emerging issues, share evolved agile practices, and cover key issues agile testers have asked to learn more about.

Packed with new examples from real teams, this insightful guide offers detailed information about adapting agile testing for your environment; learning from experience and continually improving your test processes; scaling agile testing across teams; and overcoming the pitfalls of automated testing. You’ll find brand-new coverage of agile testing for the enterprise, distributed teams, mobile/embedded systems, regulated environments, data warehouse/BI systems, and DevOps practices.

You’ll come away understanding

• How to clarify testing activities within the team

• Ways to collaborate with business experts to identify valuable features and deliver the right capabilities

• How to design automated tests for superior reliability and easier maintenance

• How agile team members can improve and expand their testing skills

• How to plan “just enough,” balancing small increments with larger feature sets and the entire system

• How to use testing to identify and mitigate risks associated with your current agile processes and to prevent defects

• How to address challenges within your product or organizational context

• How to perform exploratory testing using “personas” and “tours”

• Exploratory testing approaches that engage the whole team, using test charters with session- and thread-based techniques

• How to bring new agile testers up to speed quickly–without overwhelming them

The eBook edition of More Agile Testing also is available as part of a two-eBook collection, The Agile Testing Collection (9780134190624).


May 29, 2019  10:16 PM

2 Books on DevOps @Amazon #DevOps #DevOpsBooks

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
DevOps

Book 1: The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations by Gene Kim, Jez Humble, et al.

Synopsis from Amazon:

Increase profitability, elevate work culture and exceed productivity goals through DevOps practices.

More than ever, effective management of technology is critical for business competitiveness. For decades, technology leaders have struggled to balance agility, reliability, and security. The consequences of failure have never been greater―whether it’s the healthcare.gov debacle, cardholder data breaches, or missing the boat with Big Data in the cloud.

And yet, high performers using DevOps principles, such as Google, Amazon, Facebook, Etsy, and Netflix, are routinely and reliably deploying code into production hundreds, or even thousands, of times per day.

Following in the footsteps of The Phoenix Project, The DevOps Handbook shows leaders how to replicate these incredible outcomes, by showing how to integrate Product Management, Development, QA, IT Operations, and Information Security to elevate your company and win in the marketplace.

Book 2: What is DevOps? by Mike Loukides

Synopsis from Amazon:

Have we entered the age of NoOps infrastructures? Hardly. Old-style system administrators may be disappearing in the face of automation and cloud computing, but operations have become more significant than ever. As this O’Reilly Radar Report explains, we’re moving into a more complex arrangement known as “DevOps.”

Mike Loukides, O’Reilly’s VP of Content Strategy, provides an incisive look into this new world of operations, where IT specialists are becoming part of the development team. In an environment with thousands of servers, these specialists now write the code that maintains the infrastructure. Even applications that run in the cloud have to be resilient and fault tolerant, need to be monitored, and must adjust to huge swings in load. That was underscored by Amazon’s EBS outage last year.

From the discussions at O’Reilly’s Velocity Conference, it’s evident that many operations specialists are quickly adapting to the DevOps reality. But as a whole, the industry has just scratched the surface. This report tells you why.


May 29, 2019  10:10 PM

A Book on Test Automation @Amazon #TestAutomation

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Test Automation

Book 1: Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation (Adobe Reader) (Addison-Wesley Signature Series (Fowler)) by Jez Humble and David Farley

Synopsis from Amazon:

Winner of the 2011 Jolt Excellence Award!

Getting software released to users is often a painful, risky, and time-consuming process.

This groundbreaking new book sets out the principles and technical practices that enable rapid, incremental delivery of high quality, valuable new functionality to users. Through automation of the build, deployment, and testing process, and improved collaboration between developers, testers, and operations, delivery teams can get changes released in a matter of hours—sometimes even minutes—no matter what the size of a project or the complexity of its code base.

Jez Humble and David Farley begin by presenting the foundations of a rapid, reliable, low-risk delivery process. Next, they introduce the “deployment pipeline,” an automated process for managing all changes, from check-in to release. Finally, they discuss the “ecosystem” needed to support continuous delivery, from infrastructure, data and configuration management to governance.

The authors introduce state-of-the-art techniques, including automated infrastructure management and data migration, and the use of virtualization. For each, they review key issues, identify best practices, and demonstrate how to mitigate risks. Coverage includes

• Automating all facets of building, integrating, testing, and deploying software

• Implementing deployment pipelines at team and organizational levels

• Improving collaboration between developers, testers, and operations

• Developing features incrementally on large and distributed teams

• Implementing an effective configuration management strategy

• Automating acceptance testing, from analysis to implementation

• Testing capacity and other non-functional requirements

• Implementing continuous deployment and zero-downtime releases

• Managing infrastructure, data, components and dependencies

• Navigating risk management, compliance, and auditing

Whether you’re a developer, systems administrator, tester, or manager, this book will help your organization move from idea to release faster than ever—so you can deliver value to your business rapidly and reliably.


May 29, 2019  9:58 PM

2 Books on Software Testing @Amazon #SoftwareTesting

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Software testing

Book 1: Software Testing by Ron Patton

Synopsis from Amazon:

Software Testing, Second Edition provides practical insight into the world of software testing and quality assurance. Learn how to find problems in any computer program, how to plan an effective test approach, and how to tell when the software is ready for release. Updated from the 2000 edition to include a chapter that specifically deals with testing software for security bugs; the processes and techniques used throughout the book are timeless. This book is an excellent investment if you want to better understand what your software test team does or you want to write better software.

Book 2: Lessons Learned in Software Testing: A Context-Driven Approach by Cem Kaner, James Bach, et al.

Synopsis from Amazon:

Decades of software testing experience condensed into the most important lessons learned.

The world’s leading software testing experts lend you their wisdom and years of experience to help you avoid the most common mistakes in testing software. Each lesson is an assertion related to software testing, followed by an explanation or example that shows you the how, when, and why of the testing lesson. More than just tips, tricks, and pitfalls to avoid, Lessons Learned in Software Testing speeds you through the critical testing phase of the software development project without the extensive trial and error it normally takes to do so. The ultimate resource for software testers and developers at every level of expertise, this guidebook features:
* Over 200 lessons gleaned from over 30 years of combined testing experience
* Tips, tricks, and common pitfalls to avoid by simply reading the book rather than finding out the hard way
* Lessons for all key topic areas, including test design, test management, testing strategies, and bug reporting
* Explanations and examples of each testing trouble spot help illustrate each lesson’s assertion


May 29, 2019  9:49 PM

5 Books for Project Managers @Amazon #ProjectManager

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Project Manager

Book 1: A Project Manager’s Book of Forms: A Companion to the PMBOK Guide by Cynthia Snyder Dionisio

Book 2: The New One-Page Project Manager: Communicate and Manage Any Project With A Single Sheet of Paper by Clark A. Campbell and Mick Campbell

Synopsis on Amazon:

How to manage any project on just one piece of paper

The New One-Page Project Manager demonstrates how to efficiently and effectively communicate essential elements of a project’s status. The hands of a pocket watch reveal the time of day without following every spring, cog, and movement behind the face. Similarly, an OPPM template reduces any project—no matter how large or complicated—to a simple one-page document, perfect for communicating to upper management and other project stakeholders. Now in its Second Edition, this practical guide, currently saving time and effort in thousands of organizations worldwide, has itself been simplified, then refined and extended to include the innovative AgileOPPM™.

This Second Edition will include new material and updates including an introduction of the ground-breaking AgileOPPM™ and an overview of MyOPPM™ template builder, available online
Includes references throughout the book to the affiliated sections in the Project Management Body of Knowledge (PMBOK®)
Shows templates for the Project Management Office (PMO)

This new and updated Second Edition will help you master the one-page approach to both traditional project management and Agile project management.

(PMBOK is a registered mark of the Project Management Institute, Inc.)

Book 3: The Lazy Project Manager, Second Edition by Peter Taylor

Synopsis from Amazon:

Peter Taylor reveals how adopting a more focused approach to life, projects and work can make you twice as productive.
The Lazy Project Manager has been the project management book to own in the last six years and now this new edition brings the art of lazy productivity bang up to date. Anyone can apply the simple techniques of lazy project management to their own activities in order to work more effectively and improve their work-life balance. By concentrating your project management and learning to exercise effort where it really matters, you can learn to work smarter. Welcome to the home of ‘productive laziness’. Inside this insightful and informative book you’ll discover:
• The intelligence of laziness – why smart, lazy people have the edge over others;
• Why The Jungle Book’s ‘Bare Necessities’ should be the productive lazy theme tune;
• How to get the maximum output for a minimized input;
• Quick tips to productive lazy heaven, including avoiding project surprises and being lazy on several projects at once.
You’ll also find out why you should never go ballooning, how to deliver a good Oscar acceptance speech, and why it is important for your team that you read the newspaper each morning. And yes, you may even learn some, quick, simple but incredibly important things about project management. If you are lazy enough.


May 29, 2019  9:40 PM

3 Books On Project Management On @Amazon #ProjectManagement

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Project management

Book 1: Project Management for the Unofficial Project Manager: A FranklinCovey Title by Kory Kogon, Suzette Blakemore, and James Wood

Its synopsis on Amazon reads as:

No project management training? No problem!

In today’s workplace, employees are routinely expected to coordinate and manage projects. Yet, chances are, you aren’t formally trained in managing projects—you’re an unofficial project manager.

FranklinCovey experts Kory Kogon, Suzette Blakemore, and James Wood understand the importance of leadership in project completion and explain that people are crucial in the formula for success.

Project Management for the Unofficial Project Manager offers practical, real-world insights for effective project management and guides you through the essentials of the people and project management process:

Initiate
Plan
Execute
Monitor/Control
Close

Unofficial project managers in any arena will benefit from the accessible, engaging real-life anecdotes, memorable “Project Management Proverbs,” and quick reviews at the end of each chapter.

If you’re struggling to keep your projects organized, this book is for you. If you manage projects without the benefit of a team, this book is also for you. Change the way you think about project management—”project manager” may not be your official title or necessarily your dream job, but with the right strategies, you can excel.

Book 2:

A Guide to the Project Management Body of Knowledge (PMBOK® Guide)–Sixth Edition by Project Management Institute

Book 3: The Fast Forward MBA in Project Management (Fast Forward MBA Series) by Eric Verzuh

Synopsis from Amazon:

The all-inclusive guide to exceptional project management

The Fast Forward MBA in Project Management is a comprehensive guide to real-world project management methods, tools, and techniques. Practical, easy-to-use, and deeply thorough, this book gives you the answers you need now. You’ll find the cutting-edge ideas and hard-won wisdom of one of the field’s leading experts, delivered in short, lively segments that address common management issues. Brief descriptions of important concepts, tips on real-world applications, and compact case studies illustrate the most sought-after skills and the pitfalls you should watch out for. This new fifth edition features new case studies, new information on engaging stakeholders, change management, new guidance on using Agile techniques, and new content that integrates current events and trends in the project management sphere.

Project management is a complex role, with seemingly conflicting demands that must be coordinated into a single, overarching, executable strategy — all within certain time, resource, and budget constraints. This book shows you how to get it all together and get it done, with expert guidance every step of the way.

Navigate complex management issues effectively
Master key concepts and real-world applications
Learn from case studies of today’s leading experts
Keep your project on track, on time, and on budget

From finding the right sponsor to clarifying objectives to setting a realistic schedule and budget projection, all across different departments, executive levels, or technical domains, project management incorporates a wide range of competencies. The Fast Forward MBA in Project Management shows you what you need to know, the best way to do it, and what to watch out for along the way.


May 28, 2019  11:20 PM

Future of Voice Based Assistants – Ideas of 2 Key Players @Microsoft

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Bot, Cortana, Microsoft

Jonathan Foster and Deborah Harrison are both non-technical by qualification and profession, yet they hold important positions at one of the top technology companies in the world: Microsoft. The two are jointly engaged in a special task: designing a unique personality for Microsoft’s various voice-based personal assistants. For instance, one of their key projects is to create a personality for Microsoft’s Cortana, one of the company’s AI-powered assistants that holds a lot of promise for the coming future. Foster has been writing for film, theatre, and television for many years, and he leads a team at Microsoft responsible for creating guidelines for AI-powered conversational bots and assistants across various Microsoft platforms. Harrison is a part of Foster’s team; she was the first writer to work on the user interface for Cortana.

Deborah says, “For quite some time, in the beginning, I was the only person who was on the writing team for Cortana. It was a pretty forward-thinking feature. There is no relationship between writing for a digital agent and writing for any other user interface except for the fact that it’s all words and I’m trying to create a connection. Initially, we were looking at straightforward strings because we started with some scenarios like setting an alarm or checking the calendar. But while writing those strings, we started thinking about what it would sound like and we realized that the agent should have a more concrete identity so that we could tell what to say when and under what circumstances it should sound apologetic versus more confident and so on.”

Foster says, “We don’t want to get into a situation where we’re creating life-like interaction models that are addictive. Tech can move in that direction when you are so excited about the potential of what you can build that you’re not thinking about its impact. Thankfully, Microsoft has been a leader in ethics all up and we are a mature company that can pause and think about this stuff.”

You can read the complete interview here: https://news.microsoft.com/en-in/features/building-personalities-ai-jonathan-foster-deborah-harrison/


May 23, 2019  10:39 PM

5 Reasons To Join Digital Predictive Maintenance Conference

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Predictive maintenance

So, whether you are joining me at the upcoming Digital Predictive Maintenance Conference in Bangkok in September 2019 or not is solely at your discretion. But it is my privilege to share some good reasons why you should attend. We all strive to avoid equipment failures, irrespective of whether we are a product or a service company. After all, it is not just a question of survival but of excelling and leading the tough race with peer businesses, and digital predictive maintenance can help us to a large extent there. We all rely on equipment in whatever industry we work in: in the factory it is machines and tools; in the office it is laptops, networks, servers, the internet, and many other types of equipment. These days we can get real-time monitoring along with the ability to analyze large volumes of relevant data.

This analysis can help in predicting equipment failure, with instant notification to the concerned stakeholders so that they can take appropriate action and avoid delays. Remote network sensors, cloud systems, machine learning, the Internet of Things, and big data are some of the emerging technologies powering digital predictive maintenance. The primary application of industrial analytics in the coming years will be the predictive maintenance of industrial machinery. According to a study recently concluded by The Wall Street Journal and Emerson, unplanned downtime, caused mostly by equipment failure, costs industrial manufacturers $50 billion per year. Digital predictive maintenance not only saves up to 40% on maintenance costs but also reduces capital investment in new equipment by up to 5%. Industry professionals must learn about proven technological concepts in predictive maintenance so as to deploy the most efficient maintenance policies and procedures to optimize equipment reliability, profits, and production uptime.
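To make the idea concrete, here is a minimal, illustrative sketch that flags anomalous sensor readings with scikit-learn’s IsolationForest; the data is simulated, and a real deployment would train on live telemetry streaming in from networked sensors.

# Minimal predictive-maintenance illustration: flag anomalous sensor readings
# before they become failures (pip install scikit-learn numpy). Data is simulated.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Healthy (vibration, temperature) readings for a machine.
healthy = rng.normal(loc=[0.5, 60.0], scale=[0.05, 2.0], size=(500, 2))
# A few drifting readings that hint at a developing fault.
failing = rng.normal(loc=[0.9, 75.0], scale=[0.05, 2.0], size=(5, 2))

model = IsolationForest(contamination=0.02, random_state=0).fit(healthy)

for reading in failing:
    if model.predict(reading.reshape(1, -1))[0] == -1:  # -1 marks an anomaly
        print(f"alert: anomalous reading {reading.round(2)} - schedule maintenance")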

Comment to know more about Digital Predictive Maintenance Conference

Digital Predictive Maintenance Conference in Bangkok will have more than 60 speakers and 300 delegates. There will be 4 parallel events running that include Sensor Tech, Digital Shutdowns and Turnarounds, Digital EPC, and Digital Predictive Maintenance.


May 19, 2019  9:49 PM

Smart Tech 2019 Focuses On Present And Future of VTS @itriangleindia

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Connected vehicles, Smart cities, smart city

The key focus of the two-day Smart Tech 2019 event organized by the Smart Mobility Association was the present and future of vehicle tracking technology, and most of the talks and panel discussions revolved around various aspects of the topic. As a two-day industry symposium, it drew a high level of industry presence, not only from the government sector but also from the public and private sectors. It emerged that the modern automotive industry cannot survive without the technology, telecom, and mobility sectors; only with them can it evolve into smart automotive. The event was presented by iTriangle Infotech, India’s largest manufacturer of vehicle telematics devices. The symposium on vehicle tracking technology was very much the need of the hour, bringing together and synchronizing all industry leaders in this ecosystem, and the Smart Mobility Association was quite successful in its initiative.

Smart Tech 2019

If you look at the various developments taking place in the vehicle tracking technology spectrum across the globe, the situation is very dynamic. In a way, what is happening today is laying a proper foundation for the future of navigation, connectivity, telecom, technology, and smart transport, all of which further lead to smart cities and a smart nation. There is a lot of thrust in the country on digitization: almost Rs 2 trillion in spending is in the pipeline for the development of smart cities in India. A lot of focus is thus on getting the basics in place, like road construction, traffic management, and fleet tracking, and a large chunk of public transport is in the process of installing vehicle tracking technology. Smart Tech 2019 is a one-of-its-kind event in this regard, covering the topic from a 360-degree perspective, including promises, perspectives, challenges, scope, and risks.

Smart Tech 2019

There were more than 250 delegates from various industry segments, such as automotive, telecom, IT, and mobility, at the recently concluded Smart Tech 2019, with a key focus on vehicle tracking systems. Participants included the Ministry of Road Transport and Highways of the Government of India, the Automotive Research Association of India (ARAI), the International Centre for Automotive Technology (ICAT), Delhi Integrated Multi-Modal Transit System (DIMTS) Limited, and a number of big names in the industry. Rajiv Arora, General Secretary, Smart Mobility Association, says, “The economic scenario of today demands increased productivity while driving down costs, and many businesses are now looking for innovative ways to refine their processes. We are hopeful that this forum shall offer a cross-industry perspective towards this direction, and put forth best practices of vehicle tracking technology in India as well as across the globe.”

Dinesh Tyagi, Director, ICAT, said, “Of late, a dire need has been felt for platforms where discussions can be initiated about new-age smart devices and technologies. I am sure Smart Tech 2019 will address this issue and provide a comprehensive knowledge platform for all the industry experts and professionals alike.”


May 17, 2019  9:18 PM

Ozonetel launches its very own Voice Bot Platform

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Chatbot, Cloud communications, Telephony

Creating a voice bot has never been easier: Ozonetel has launched its very own Voice Bot Platform. Technology is continuously changing and redefining the way businesses interact with their clientele, rendering customized and tailored recommendations. The latest trend in business communication is the employment of voice bots. Assistants such as Alexa and Google Assistant are great examples of voice bots, having secured a special place in customers’ homes and retail stores.

Voice bots make life more convenient, so much so that they have become all-pervasive. Their popularity continues to grow, with more and more businesses choosing to employ voice bots to communicate with their customers.

Ozonetel, a leading provider of on-demand cloud communication/telephony solutions, has launched its very own Voice Bot Platform. The platform enables easy development of voice bots for various voice endpoints, i.e., telephony, mobile apps, websites, and digital assistants. All existing chatbots can effortlessly be ported to the Voice Bot Platform, which provides a new voice channel for all chatbots.

Chaitanya Chokkareddy, Chief Innovation Officer, Ozonetel, says, “We are very excited about this latest innovation from Ozonetel. We believe voice bots are ushering in a new era in customer support worldwide. The Ozonetel Voice Bot platform will make it very easy and cost-effective for organizations to voice-enable their existing chatbots and build new ones. Ozonetel has been a pioneer in cloud communication solutions, and this new platform will make it incredibly easy for any organization, be it a start-up, mid-sized or a large enterprise, to adopt and experiment with voice bots easily.”


May 13, 2019  6:56 PM

How To Tackle Employee Engagement Crisis @StaffConnectApp

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Employee engagement, Remote worker

Over 2.7 billion deskless employees are part of the global employee engagement crisis, a big pain for enterprises across the globe. As a matter of fact, deskless workers, who make up as much as 80 percent of the global workforce, show very high levels of disengagement, causing billions of dollars of loss annually to organizations. All this is because of limited-reach communication channels. In this context, to address this severe issue, StaffConnect has just released a new eBook, ‘How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees’, available to download now. StaffConnect is a pioneer in providing mobile employee engagement solutions for the deskless workforce. The eBook touches on some very basic questions, like why individuals often become disengaged. It is enterprises with workforces deployed remotely and offsite that face these issues most, and the eBook provides important information in this regard.

The eBook not only touches on the basic pain areas but also provides solutions to them. It is mobile technologies that can inspire employee engagement and turn this trend around. Deskless employees face more limitations than their onsite peers, particularly around communication and access to company systems. These limitations drastically impact their level of engagement, which gradually hurts bottom-line success. According to Bureau of National Affairs reports, organizations lose more than $11 billion each year because of employee turnover, a direct consequence of employee disengagement. The demand and necessity for a more mobile workforce have increased tremendously for two main reasons: the gig economy and a steady increase in remote working. This has made it harder to reach deskless employees working in varied locations.

What many enterprises lack is a prioritized, systematic approach to facilitating timely and reliable two-way communication with their deskless employees to ensure high levels of engagement. Businesses should consider the right mobile technology tools and platforms to keep deskless workers in the loop and enhance communication and collaboration. As a matter of fact, most deskless employees are not even aware of the basics when it comes to internal communication. That is the reason a vast majority (as high as 84 percent) of deskless staff do not receive sufficient communication to perform their job effectively, as stated by Tribe Inc. The eBook explains how, using the StaffConnect platform, deskless employees can have access to company information 24/7 through its mobile app, irrespective of role or location. That includes real-time updates from the company and the CEO.

Geraldine Osman, CMO, StaffConnect says, “In a world that’s increasingly digitally driven and focused—combined with a shift toward a workforce that is now primarily deskless—the key to increasing employee engagement is integrally connected to technology. To effectively drive engagement across the entire organization, businesses need to implement mobile-enabled apps that are capable of reaching every employee and delivering an engaging user experience. This prevents the silos between office and field-based employees and facilitates a more unified and positive culture that ultimately leads to better performance, retention and customer satisfaction.”

To download and read, “How Can Enterprises Overcome the Global Employee Engagement Crisis That Impacts 2.7 Billion Deskless Employees,” visit: https://www.staffconnectapp.com/download-the-deskless-workforce-ebook/

Video Links:
The Deskless Workforce – https://www.youtube.com/watch?v=PInhpPdo5rc
The Impact of Employee Disengagement – https://www.youtube.com/watch?v=9MZdk-Zx3OY

Betsi Cadwaladr University Health Board (BCUHB) is the largest health organization in Wales, delivering a full range of primary, community, mental health, and acute medical services from three main hospitals. In addition, there is a network of 38 community hospitals, health centers, clinics, offices, mental health units, and community team bases, with a workforce of over 17,000 employees serving more than 760,000 people across all six counties in North Wales. BCUHB has attained tremendous success in employee engagement among its deskless staff by means of unified two-way communication through the StaffConnect Mobile Employee Engagement Platform.

Aaron Haley, Communications Officer at BCUHB says, “While email works for our desk-based staff, there’s a big contingency of our workforce who just can’t find the time to get on to a computer as part of their working day. We wanted the new internal communications tool to be completely voluntary, and we wanted to demonstrate our commitment to improving internal communications with a platform that meets the needs of all our employees, regardless of their role or location.”


May 2, 2019  7:30 PM

Stateless Luxon For Colocation and MSPs @bestateless

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Colocation, Data Center, Hybrid cloud, Managed service providers, MSP

Stateless Luxon is a new software-defined interconnect product from Stateless Inc. This software-based product creates seamless connectivity between data centers and between the data center and the hypercloud. Before going any further into the details of the product, let’s understand a bit about the rich background of the company that delivers it. Stateless Inc. was founded by Eric Keller, CTO, and Murad Kablan, CEO, both Ph.D.s on the academic front. Through their work with AT&T and IBM, where they noticed first-hand how network virtualization had failed to a large extent, they realized the need to develop such a product. They also authored Stateless Network Functions. In other words, network virtualization had been an overhyped subject with little success. The founders of Stateless Inc. have a very simple yet powerful vision. Their vision statement says: ‘Give the World the power to simply create customized network on-demand’.

Stateless Luxon

If developing something like Stateless Luxon were simple, it would have been done a couple of decades ago. Even the tech giants in this field had been craving to achieve it, with no success; perhaps they didn’t have the right direction. Let’s look at some interesting figures. The average number of clouds used per enterprise is 5 (Source: https://www.cio.com/article/3267571/it-governance-critical-as-cloud-adoption-soars-to-96-percent-in-2018.html). 80% of enterprises will shut down their data centers by 2025 (Source: https://blogs.gartner.com/david_cappuccio/2018/07/26/the-data-center-is-dead/). Interconnectivity bandwidth will grow at a 48% CAGR through 2021 (Source: http://phx.corporate-ir.net/External.File?t=1&item=VHlwZT0yfFBhcmVudElEPTUyNzIyNTl8Q2hpbGRJRD02OTU4MDA=). This clearly indicates a rapidly rising global trend: multi-site + hybrid. In fact, it is already here. Monetizing multi-site + hybrid is the biggest challenge for colocation and cloud managed service providers. They have started focusing on monetizing links into the data centers; as a matter of fact, they need to monetize all connectivity links.

Stateless Luxon Is The Future

The situation is like chasing a moving target. In today’s dynamic, multi-hybrid cloud ecosystem, dedicated L1 + L2 network appliances are not able to deliver. It is a complex situation of new and unique applications, new network connections, and new vectors to monetize. Software-defined interconnect (SD-IX) means delivering, controlling, and monetizing composable Layer 3+ network services to interconnect points through software. That’s the Luxon software platform. Four key attributes of Stateless Luxon are that it is multitenant, automated, API-driven in its visibility, and consolidated and evolvable. Basically, with Luxon, you don’t need to buy new appliances; new functionality can be deployed through software. General availability is slated for Q3 2019.

Philbert Shih, Managing Director, Structure Research, says, “Colocation and cloud service providers are set to experience growing demand as enterprises move away from on-premise data centers. Stateless is poised to capitalize on a growing opportunity as outsourced infrastructure revenue is expected to accelerate significantly through 2022, when it is forecasted to reach $382.63 billion, up from $138 billion in 2018.”

Stateless Luxon is the first step in the mission

Bob Laliberte, Practice Director & Senior Analyst, ESG says, “The business expectations of colocation and data center providers have changed as enterprises continue to decentralize workloads, and these providers must now position themselves as a primary hub for this tenant traffic. These providers can now leverage the Luxon platform to effectively scale their business and easily monetize additional connections in the data center.”

Murad Kablan, CEO and co-founder, Stateless says, “Luxon is the first step in Stateless’ mission to provide users with the power to create simple, customized networks on demand. The platform provides colocation and cloud MSPs the real-time agility they need to adapt to ever-changing business requirements and priorities.”


April 29, 2019  6:42 PM

1000 XSeries is a powerful protocol analyzer @Keysight

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Measurement, Testing

This is the concluding post of the three-part interview with Sandeep Kapoor, EMEAI & Marketing Manager India, Keysight Technologies.

The first post is here. The second post can be accessed by clicking here.

6. What is Six-in-one instrument integration?

The six-in-one integration comprises an oscilloscope + function generator (20 MHz, with modulation) + hardware-based serial protocol trigger and decode + frequency response analyzer + digital voltmeter + frequency counter.

The 1000 X-Series offers an integrated 20 MHz function generator with signal modulation capability, ideal for educational or design labs where bench space and budget are at a premium. The integrated function generator provides stimulus output of sine, square, ramp, pulse, DC, and noise waveforms to devices under test. You can add modulation to the signal with customizable AM, FM, and FSK settings. In short, it is an oscilloscope with an integrated function generator.
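For readers who script their bench instruments, here is a minimal sketch of how such a stimulus might be configured over SCPI with PyVISA. The VISA address is hypothetical and the :WGEN command forms are assumptions drawn from Keysight’s InfiniiVision SCPI family; verify them against the 1000 X-Series programmer’s guide before use.

import pyvisa

rm = pyvisa.ResourceManager()
# Hypothetical VISA address; substitute your instrument's actual address.
scope = rm.open_resource("USB0::0x2A8D::0x0396::MY_SERIAL::INSTR")

print(scope.query("*IDN?"))              # confirm the instrument responds
scope.write(":WGEN:FUNCtion SINusoid")   # sine stimulus
scope.write(":WGEN:FREQuency 1E6")       # 1 MHz
scope.write(":WGEN:VOLTage 1.0")         # 1 Vpp
scope.write(":WGEN:MODulation:TYPE AM")  # add amplitude modulation
scope.write(":WGEN:MODulation:STATe ON")
scope.write(":WGEN:OUTPut ON")           # enable the stimulus output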

The 1000 X-Series is also a powerful protocol analyzer, with hardware-based triggering and decode that enable specialized serial communication analysis. Other vendors’ oscilloscopes use software post-processing techniques that slow down the waveform and decode update rate, but the 1000 X-Series decodes faster using hardware-based technology, which enhances scope usability and the probability of capturing infrequent serial communication errors.

The 1000 X-Series’ frequency response analyzer capability is the perfect tool to help students understand the gain and phase performance of passive RLC circuits or active op-amps. This capability is delivered as a gain and phase measurement versus frequency (a Bode plot). Vector network analyzers (VNAs) and low-cost frequency response analyzers are typically used for these measurements, but now easy-to-use gain and phase analysis is possible by utilizing the 1000 X-Series’ built-in WaveGen.
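As a refresher on what the instrument computes at each frequency step, here is a small illustrative sketch (not Keysight code) that estimates gain in dB and phase shift from captured input/output waveforms, assuming a single dominant stimulus frequency.

import numpy as np

def gain_and_phase(vin, vout, dt):
    """Estimate gain (dB) and phase shift (degrees) of vout relative to vin.

    Both waveforms are sampled at interval dt; a single dominant stimulus
    frequency is assumed, as in a swept Bode measurement."""
    n = len(vin)
    window = np.hanning(n)                     # reduce spectral leakage
    fft_in = np.fft.rfft(vin * window)
    fft_out = np.fft.rfft(vout * window)
    k = np.argmax(np.abs(fft_in[1:])) + 1      # dominant bin, skipping DC
    freq_hz = k / (n * dt)                     # frequency of that bin
    gain_db = 20 * np.log10(np.abs(fft_out[k]) / np.abs(fft_in[k]))
    phase_deg = np.degrees(np.angle(fft_out[k]) - np.angle(fft_in[k]))
    return freq_hz, gain_db, phase_deg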

The 1000 X-Series has an integrated 3-digit voltmeter (DVM) inside each oscilloscope. The voltmeter operates through probes connected to the oscilloscope channels, but its measurement is decoupled from the oscilloscope triggering system, so both the DVM and triggered oscilloscope measurements can be made with the same connection. AC RMS, DC, and DC RMS can be measured quickly without configuring the oscilloscope capture. The voltmeter results are always displayed, keeping these quick characterization measurements at your fingertips.

There is an integrated 5-digit frequency counter inside each oscilloscope. It operates through probes connected to the oscilloscope channels, but its measurement is decoupled from the oscilloscope triggering system, so both the counter and triggered oscilloscope measurements can be made with the same connection.

Additional Information

· More information about the Keysight InfiniiVision 1000 X-Series oscilloscopes is available at https://connectlp.keysight.com/FindYourLocation_1000Xoscilloscope.

· Electronic media kit, including images and additional supporting quotes, is available at https://about.keysight.com/en/newsroom/mediakit/infiniivision-1000x/.

· Information about the company’s complete line of oscilloscopes is available at https://www.keysight.com/find/scopes.


April 29, 2019  6:35 PM

Oscilloscope base model starts from 70MHz, upgradable to 200MHz @Keysight

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Applications, Measurement, Wireless

This is part 2 of the series. You can read part 1 here.

4. What are the roles of the comprehensive lab guide and the oscilloscope fundamentals slide set?

The Educator’s Oscilloscope Training Kit provides an array of built-in training signals so that electrical engineering and physics students can learn what an oscilloscope does and how they can perform basic oscilloscope measurements. Also included in the kit is a comprehensive oscilloscope lab guide and tutorial written specifically for the undergraduate student. Keysight also provides a PowerPoint slide-set that professors and lab assistants can use as a pre-lab lecture on oscilloscope fundamentals. This lecture takes about 30 minutes and should be presented before electrical engineering and physics students begin their first circuits lab. Note that this PowerPoint slide-set also includes a complete set of speaker notes.

InfiniiVision oscilloscope

5. What do you mean by bandwidth upgradable via software license?

The oscilloscope base model starts at 70 MHz but can be upgraded to 100 MHz or 200 MHz, as the hardware capability is already there. Customers can purchase the upgrade option anytime and enable it via a software license key. It is an investment-friendly feature.

Bandwidth is often regarded as the single most important characteristic of an oscilloscope. Measured in Hertz, the bandwidth of your oscilloscope is the range of frequencies it can accurately measure. Without enough bandwidth, the amplitude of your signal will be incorrect and details of your waveform may be lost. We help customers get fast, accurate answers to their measurement questions; that’s why we offer the largest range of compliance and debugging application-specific oscilloscope software. These applications are engineered to work with the oscilloscope to quickly and easily provide exceptional insight into signals.
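To make the bandwidth point concrete, here is a back-of-the-envelope sketch of the common, vendor-neutral rules of thumb: roughly 0.35 divided by the signal’s rise time for a Gaussian-response scope, and about five times the fundamental for digital clock signals.

def bw_from_rise_time(rise_time_s: float) -> float:
    """Required bandwidth in Hz, Gaussian-response approximation."""
    return 0.35 / rise_time_s

def bw_for_clock(clock_hz: float) -> float:
    """Rule of thumb: capture up to the 5th harmonic of a digital clock."""
    return 5 * clock_hz

# A 3.5 ns edge needs ~100 MHz; a 20 MHz clock also needs ~100 MHz.
print(bw_from_rise_time(3.5e-9) / 1e6, "MHz")
print(bw_for_clock(20e6) / 1e6, "MHz")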

X-Series measurement applications increase the capability and functionality of Keysight Technologies signal analyzers to speed time to insight. They provide essential measurements for specific tasks in general-purpose, cellular communications, and wireless connectivity applications, covering established standards and modulation types. Applications are supported on both benchtop and modular instruments, with the only difference being the level of performance achieved by the hardware you select. We provide a range of license types (node-locked, transportable, floating, or USB portable) and license terms (perpetual or time-based).

Concluding post link: https://itknowledgeexchange.techtarget.com/quality-assurance/1000-xseries/


April 29, 2019  6:24 PM

InfiniiVision 1000 X-Series Oscilloscopes From @Keysight Technologies

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Instrumentation, Testing

Over the next three posts, we are in discussion with Mr. Sandeep Kapoor, AD MSM – EMEAI & Marketing Manager India, Keysight Technologies.

1. What are the key features of the new models of the InfiniiVision 1000 X-Series oscilloscopes?

InfiniiVision 1000 X-Series oscilloscopes are entry-level instruments with professional-level capabilities. They offer 50 MHz to 200 MHz bandwidth and 2 or 4 analog channels, and reveal more signal detail with a 50,000 wfms/sec update rate. They can make professional measurements, including mask, math, FFT, analog bus, and protocol triggering/decode. InfiniiVision 1000 X-Series oscilloscopes are engineered to give quality, industry-proven technology at unbelievably low prices, with professional-level functionality, industry-leading software analysis, and 6-in-1 instrument integration.

Further, the two models, EDUX1002A and EDUX1002G, provide a quality education for students and prepare them for industry on professional-level instruments. The 1000 X-Series leverages the same technology as higher-end oscilloscopes, allowing students to learn on the same hardware and software being used in leading R&D labs. The built-in training signals enable students to quickly learn to capture and analyze signals.

InfiniiVision

2. How is it able to improve overall efficiency?

By virtue of multiple unique hardware specifications (fast update rate, deep memory, bandwidth) and software analysis tools (serial trigger, decode, offline software analysis, frequency response, etc.), it becomes one of the most user-friendly tools for basic debugging as well as an application-specific test and measurement tool. Users can make automatic measurements without spending too much time, which improves efficiency.
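As an illustration of how such automatic measurements can be scripted, here is a minimal PyVISA sketch. The VISA address is hypothetical and the :MEASure query forms are assumptions from Keysight’s InfiniiVision SCPI family; confirm them against the 1000 X-Series programmer’s guide.

import pyvisa

rm = pyvisa.ResourceManager()
# Hypothetical VISA address; substitute your instrument's actual address.
scope = rm.open_resource("USB0::0x2A8D::0x0396::MY_SERIAL::INSTR")

scope.write(":AUToscale")   # let the scope configure itself on the live signal
freq = float(scope.query(":MEASure:FREQuency? CHANnel1"))
vpp = float(scope.query(":MEASure:VPP? CHANnel1"))
print(f"CH1: {freq / 1e6:.3f} MHz, {vpp:.3f} Vpp")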

3. How is it able to bring high-end technology into an affordable price range?

Keysight has its own R&D and fab, wherein custom ICs and MMICs are designed and fabricated. This helps in leveraging specific technologies across multiple platforms. Also, Keysight has a special focus on academia, working closely with universities worldwide and offering different solutions at an affordable price range under special initiatives.

The discussion continues in the next two posts. Next post link: https://itknowledgeexchange.techtarget.com/quality-assurance/oscilloscope/


April 29, 2019  6:01 PM

Japan Means Quality, China Means Mass Production

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
quality

Japan strives for quality in every aspect of life. Quality comes first in any kind of production there; no other factor is allowed to surpass it. A lot of organizations follow Japanese quality management techniques and philosophies, but how many of them follow these ethically and aesthetically? No Japanese company will compromise on the quality of its products. China, on the other hand, believes in mass production and the price-war game. It wants to be on top in a price war, and to get there, it doesn’t mind compromising on the quality of the product. While a Japanese product is built for life-long use, a Chinese product is use-and-throw: you use it as long as it works, and after that you just throw it away.

There is no concept of repair in a Chinese product. Since the price of Chinese products is so low, you can afford to throw one away and buy a new one. But does it really make sense? For instance, you buy a Japanese product for $100 and it runs for many years without any major failure. A similar product from China might cost, say, $25. Now, if that Chinese product goes bad in a couple of years and is hard to repair, or the repair cost is higher than the cost of a new product, then obviously you have no choice other than to throw away the old one and buy a new one. So every couple of years you spend $25, which in the long run costs more than the Japanese product.
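A toy total-cost-of-ownership calculation makes the arithmetic explicit; the prices and lifetimes below are this article’s illustrative assumptions, not market data.

def total_cost(unit_price: float, lifetime_years: int, horizon_years: int) -> float:
    """Units bought over the horizon (ceiling division) times unit price."""
    units = -(-horizon_years // lifetime_years)
    return units * unit_price

horizon = 10
print("Japanese product:", total_cost(100, 10, horizon))  # $100 over 10 years
print("Chinese product: ", total_cost(25, 2, horizon))    # $125 over the same 10 years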

Quality versus mass production

While the Japanese product works efficiently without any hiccups, it also saves your time and money. The Chinese product, on the other hand, is not only substandard in quality but also wastes your time and money whenever it goes out of order.


April 29, 2019  5:30 PM

Dr. Joseph M. Juran – Quality Guru Series – Quality Guru III

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Pareto principle, Quality management

Dr. Joseph Juran is known for his quality trilogy, which consists of quality planning, quality improvement, and quality control. The three are sequential and form a cyclic process, because there is no end to quality improvement: the deeper you go and the more you think, the more ideas you get to improve. Every improvement, therefore, needs a proper plan and execution. Mere execution doesn’t add value unless you find ways to sustain the improvements; they are not one-time achievements, and their sustenance matters more. As a matter of fact, you can move to the next level only if you are able to sustain the current level. Falling back below it causes deterioration.

Joseph Juran

Here are some famous quotes from Dr. Joseph Juran:

“All improvement happens project by project and in no other way.”

“Goal setting has traditionally been based on past performance. This practice has tended to perpetuate the sins of the past.”

“Without a standard, there is no logical basis for making a decision or taking action.”

“It is most important that top management be quality-minded. In the absence of sincere manifestation of interest at the top, little will happen below.”

“Quality planning consists of developing the products and processes required to meet customers’ needs.”

Joseph Juran

“A good rule in organizational analysis is that no meeting of the minds is really reached until we talk of specific actions or decisions. We can talk of who is responsible for budgets, or inventory, or quality, but little is settled. It is only when we get down to the action words (measure, compute, prepare, check, endorse, recommend, approve) that we can make clear who is to do what.”

Joseph Juran was born in 1904 and died in 2008. He did a great deal of work in the field of quality and quality management. He is known for advancing Pareto Analysis, built on the principle observed by Vilfredo Pareto: the ‘vital few’ causes account for most of the effects.
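In the spirit of Juran’s ‘vital few and trivial many’, here is a minimal Pareto-analysis sketch that ranks defect causes and stops at the few that account for roughly 80 percent of occurrences. The defect counts are made-up illustration data.

# Made-up defect tallies for illustration only.
defects = {"scratches": 120, "misalignment": 45, "wrong label": 30,
           "discoloration": 15, "missing screw": 8, "other": 7}

total = sum(defects.values())
cumulative = 0.0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count / total
    print(f"{cause:15s} {count:4d}  cumulative {cumulative:5.1%}")
    if cumulative >= 0.80:   # the "vital few" threshold
        break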


April 29, 2019  2:32 PM

Dr. W. Edwards Deming – Quality Guru Series – Quality Guru II

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Deming, PDCA, Quality management

Dr. W. Edwards Deming did an astonishing job of encapsulating his complete ideology of management in just 14 points, followed by the 7 deadly diseases of management to cover the other side of the coin. His main focus in life and profession was the state of quality, and he began his quality journey by picking up where Dr. Shewhart had left off. He explained variation and control charts with precision, and he promoted and popularized the PDCA cycle by showcasing its importance in the business world. The quality world knows him for the Deming Cycle.

Some of the great quotes by Dr. Edwards Deming are as below:

“It is not necessary to change. Survival is not mandatory.”

“If you can’t describe what you are doing as a process, you don’t know what you’re doing.”

“It is not enough to do your best; you must know what to do, and then do your best.”

“Experience teaches nothing without theory.”

“People are entitled to joy in work.”

“Learning is not compulsory… neither is survival.”

“The idea of a merit rating is alluring. The sound of the words captivates the imagination: pay for what you get; get what you pay for; motivate people to do their best, for their own good.

The effect is exactly the opposite of what the words promise. Everyone propels himself forward, or tries to, for his own good, on his own life preserver. The organization is the loser.

The merit rating rewards people that conform to the system. It does not reward attempts to improve the system. Don’t rock the boat.”

Dr. Edwards Deming

Dr. Edwards Deming was born on October 14, 1900, in Sioux City, Iowa, United States. He died on December 20, 1993, in Washington, D.C., United States.


April 29, 2019  2:15 PM

Dr. Walter Shewhart – Quality Guru Series – Quality Guru I

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
PDCA

Dr. Walter Shewhart is known as one of the pioneers of quality. He is among the first few people who made the world understand what quality is and why it is so important for thriving in the business world. Dr. Shewhart, in fact, is the one who developed the concept of the Plan-Do-Check-Act or PDCA cycle. Some organizations alternatively call it Plan-Do-Study-Act or PDSA, but conceptually the two are one and the same. Dr. Shewhart also developed theories of process control and the globally acknowledged Shewhart transformation process.
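Shewhart’s process-control idea can be shown in a few lines: estimate control limits from an in-control baseline, then flag later points beyond mean ± 3 sigma as candidates for assignable-cause variation. All numbers below are made-up illustration data.

import statistics

# In-control baseline measurements (made up for illustration).
baseline = [10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.0, 9.9, 10.2, 10.1]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # upper/lower control limits

for i, x in enumerate([10.0, 10.2, 12.9, 9.9]):  # new production samples
    status = "assignable cause?" if not (lcl <= x <= ucl) else "in control"
    print(f"sample {i}: {x:5.1f}  {status}")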

Here are some of the top quotes by Dr. Walter Shewhart:

Dr. Walter Shewhart

Postulate 1. All chance systems of causes are not alike in the sense that they enable us to predict the future in terms of the past.

Postulate 2. Constant systems of chance causes do exist in nature.

Postulate 3. Assignable causes of variation may be found and eliminated.

Both pure and applied science have gradually pushed further and further the requirements for accuracy and precision. However, applied science, particularly in the mass production of interchangeable parts, is even more exacting than pure science in certain matters of accuracy and precision.

Rule 1. Original data should be presented in a way that will preserve the evidence in the original data for all the predictions assumed to be useful.

Every sentence in order to have definite scientific meaning must be practically or at least theoretically verifiable as either true or false upon the basis of experimental measurements either practically or theoretically obtainable by carrying out a definite and previously specified operation in the future. The meaning of such a sentence is the method of its verification.

Dr. Walter Shewhart

Dr. Walter Shewhart was born on March 18, 1891, and died on March 11, 1967, at the age of 76. He was an engineer by occupation. A lot has happened in this field since, but the foundations he established, like PDCA and process control, remain as valuable today as they were then.


April 28, 2019  11:04 PM

How Smart Are Latest Smart Devices? #SmartDevices

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Intelligent device, Smart Device, smart home devices

At a recent mega launch of smart home appliances by one of the top global brands in India, I found myself thinking about how smart these smart devices really are. Who defines their smartness, and to what extent does it impact our daily life as end consumers? The intelligent devices launched included smart TVs, washing machines, air conditioners, headphones, etc. For smart televisions, the features emphasized to establish their smartness were huge size, excellent picture quality, slimness, the huge amount of content carried, multi-language support, and so on. All of this is good, especially the content that I can choose according to my personal preference. That personalization is welcome, but is it all I would need? Can the device cross certain barriers and become smarter, really smarter? For instance, can it allow bookmarks, content highlighting, etc.?

Similarly, when it comes to washing machines, can a washing machine be intelligent to the extent that the water it uses in the first cycle gets purified within the machine in a separate compartment and is reused in the second cycle? The same process could purify this water again for the third cycle, and after the final cycle the water could be collected in a bucket, detergent-free, for watering my plants. Wouldn’t that be more eco-friendly? On the same note, why can’t my smart headphones charge while I jog or exercise, converting that energy into charge, something like a self-charging phenomenon? Air conditioners, for instance, consume a large amount of electricity. Can there be a mechanism whereby they run with the least consumption of electricity and throw out the least heat?

Smart Devices need to be more intelligent to address global concerns

All the above thoughts are what intelligence means in my view. In the name of intelligent devices, there is huge scope. But every smart device feature must exist for a cause, addressing a global concern, not merely for the fancy and fantasy of the consumer.


April 26, 2019  10:13 PM

Can Technology Turn Sour Customer Experience To Sweet One?

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Automation, customer experience, Technology

This is a real-life example involving the world’s largest e-retail enterprise operating in India. My purpose in quoting it is to showcase how a small absence of technology can become a big differentiator. Since it is an online store, I just log in to my account on my laptop or smartphone through its desktop or mobile app and do the needful. Everything is smooth: finding what I want, ordering, and getting confirmation of delivery for a given date. Fantastic. In fact, that is the reason I stick to this online hyper-retail store. Technology is fully leveraged and things are flawless so far. The processes are well defined and automated, which is why it is able to handle millions of customers concurrently at any moment, and a similar number of orders in a day.

customer experience

Photo credit: RG&B on Visual hunt / CC BY-NC-ND

Every delivery is recorded immediately by the delivery person. I, as a customer, immediately get an alert that the item has been delivered, asking for my feedback on the experience. The first shortfall in this feedback flow is that it only presents preset options: do the rating and submit. There is no place to enter remarks. The incident in my case was a false delivery report. The delivery person called me one evening to inform me that he had a bundle to deliver. Nobody was at home, so I requested him to come back the next day. He asked if he could hand it over to somebody a floor below or above, or to the security person, which I declined. The delivery person agreed to come the next day, at least to me over the phone.

Customer Experience and Technology

Surprisingly, within the next two hours I got an alert that the consignment had been delivered. Shockingly, the status showed it had been received by me in person. Could technology not do a lot in this case and turn the incident into a happy-ending story?


April 26, 2019  9:53 PM

Last Mile Technology Proves To Be The Biggest Differentiator

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Automation, Technology

Digital transformation is a buzzword for many CIOs, CTOs, and CXOs in most enterprises. In the name of technology, they might have a lot in place. But, in my opinion, what matters most is the last-mile technology, which defines your maturity and success. Having SAP, CRM, and other enterprise apps in place is fine, but what about where technology is needed most: the endpoint, the place where your actual consumer or customer shakes hands with the last face of your business? Let me elaborate with a few real-life examples. There is a top government-run milk and milk products company in India with an excellent technology framework and ecosystem, as its CIO claims. It is a decades-old organization. Besides selling milk and dairy products, it also runs outlets selling vegetables, fruits, pulses, etc. The number of these outlets is huge.

last mile technology

Photo on VisualHunt

What this organization lacks is a mechanism to automate its billing and inventory. With a little automation, it could easily identify the balance of items at any moment. It has recently done this, though it came late and is still full of glitches. More important than this is knowing the pulse of the local consumers who come to these stores; that includes me, for the store near my residence. When I need a specific vegetable at this store, I am told either that it wasn’t replenished today or that it is finished. Both are quite vague reasons in this world of technology. The worst part I have seen is that this local store belonging to the organization does a lot of manipulation. Let’s look at a real-life instance.

Where is the last mile technology?

For example, the store owner orders 20 kilograms of fresh, good-quality pumpkins from the warehouse and receives that quantity and quality, say, in the early morning hours. Within 10 minutes the entire fresh lot is shifted to the local vegetable vendor and replaced with the same quantity of lower-grade pumpkins. These then sell at the high-quality rate while, in reality, the buyer is getting lower-quality material. There is no check-and-control mechanism here. Where is the last-mile technology? And where are the CXOs of this organization, living far away from the real fruits technology could reap for them?


April 19, 2019  11:00 PM

TensorFlow Plug-In Enhances Machine Learning Capabilities @Quobyte

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Distributed File System, Machine learning

The TensorFlow plug-in recently launched by Quobyte for its distributed file system enhances throughput performance by 30 percent for machine learning (ML) training workflows. This is probably the first plug-in of its kind with a proven enhancement in machine learning capabilities. Quobyte is among the top developers of modern storage system software, and the Quobyte Data Center File System is the world’s first distributed file system to offer a TensorFlow plug-in, empowering customers with increased throughput performance and linear scalability for ML-enabled enterprise applications. This, in turn, enables faster and more uniform training across larger data sets, which automatically ensures higher-accuracy outcomes and enhances a business’s decision-making capabilities. Any business decision loses its sanctity if not taken in time; a delayed business decision obviously damages business growth. TensorFlow itself, in fact, is an open-source library for numerical computation.

TensorFlow

Source: Quobyte.com

TensorFlow fits all businesses, as it is an open-source library. Further, with its numerical computation capabilities handling large data sets, it promises to advance large-scale machine learning, which is already setting roots across technology segments, industries, and institutions: autonomous vehicles, financial services, robotics, government, defense, aerospace, and so on. The scope is endless; it is only a question of how much an industry can leverage its power to enhance decision-making capabilities. With the help of Quobyte storage and TensorFlow, any industry vertical can simplify and streamline its machine learning operations. The plug-in enables TensorFlow applications to communicate directly with Quobyte, bypassing the operating system kernel. This substantially reduces kernel-mode context switching, which in turn lowers CPU usage. The best part is that Quobyte storage can work with all stages of ML, thereby increasing GPU utilization through the TensorFlow plug-in.

TensorFlow Is An Open Source Library

Apparently, the increased GPU utilization from the TensorFlow plug-in significantly improves model training in ML workflows. Users gain the flexibility to train anywhere, and models can move seamlessly into production. Since the plug-in has nothing to do with the kernel, it works with any version of Linux, providing a high degree of deployment flexibility for ML use. As a matter of fact, you don’t require any application modifications while using the Quobyte TensorFlow plug-in.
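The ‘no application modifications’ point is easiest to see in code. Below is a generic tf.data input-pipeline sketch, not Quobyte’s actual API: the idea of a file-system plug-in is that a pipeline like this keeps working unchanged while reads are served outside the kernel path. The mount path and feature spec are made-up assumptions for illustration.

import tensorflow as tf

# Hypothetical mount point; with a file-system plug-in, the same code path
# would transparently read through the user-space client instead.
files = tf.data.Dataset.list_files("/quobyte/training/*.tfrecord")

feature_spec = {"x": tf.io.FixedLenFeature([784], tf.float32),
                "y": tf.io.FixedLenFeature([], tf.int64)}

dataset = (tf.data.TFRecordDataset(files, num_parallel_reads=tf.data.AUTOTUNE)
           .map(lambda rec: tf.io.parse_single_example(rec, feature_spec),
                num_parallel_calls=tf.data.AUTOTUNE)
           .shuffle(10_000)
           .batch(256)
           .prefetch(tf.data.AUTOTUNE))

for batch in dataset.take(1):   # in real training, feed into model.fit(dataset)
    print(batch["x"].shape, batch["y"].shape)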

Frederic Van Haren, Lead Analyst HPC and AI Systems of analyst firm Evaluator Group says, “As more and more businesses look to leverage ML to increase innovation, achieve a faster time to market and provide a more positive customer experience, there is an increasing need for storage infrastructures that offer higher performance and increased flexibility that these workloads need. Vendors, like Quobyte, that offer high performance, broad platform support and flexibility of deployment options are well positioned to help companies handle bigger data sets, achieve more accurate results and run ML workloads in any environment.”

Bjorn Kolbeck, Quobyte CEO says, “By providing the first distributed file system with a TensorFlow plug-in, we are ensuring as much as a 30 percent faster throughput performance improvement for ML training workflows, helping companies better meet their business objectives through improved operational efficiency. With the higher accuracy of results, scalability to handle bigger data sets and flexibility to run on-prem to the cloud, and edge, we believe we are providing an optimal experience that allows customers to fully leverage the value of their Machine Learning infrastructure investments.”


April 11, 2019  8:50 PM

Cynet Threat Assessment Program – Key Insights @Cynet360

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Cyber security, cybersecurity, enterprise cyber security

Cynet is among the global leaders in automated threat discovery and mitigation. If you are an organization with 500 or more endpoints, it is important for you to understand its threat assessment program. One of the program’s agendas is to offer a free cybersecurity threat assessment, and since there is no upper limit on endpoints, even the largest organizations can take advantage of it. Of course, the risk for any organization depends on two main factors: the size of the business and the type of business. The larger the business, the larger the risk; and the more critical the business, the more crucial it becomes to ensure your online systems’ availability. If I compare Amazon and Walmart, Amazon, having no physical stores, can’t afford even a few minutes of downtime of its systems; a few minutes could cost billions.

Cynet

Source: Cynet

When it comes to cybersecurity, threat assessment is equally important for all businesses. Every business, whether operating online or offline, has external exposure. The business itself might not be online, but its various locations accessing enterprise solutions have online exposure. That is why checking its true security posture is important for every business. It is equally important to profile risk exposure relative to industry peers: obviously, if you are at higher risk than your industry peers, your business has a higher risk of being diverted to them.

Cynet offers free threat assessment

As every customer wants peace of mind, a business with the least risk will be any customer’s first choice. The Cynet Threat Assessment program lets organizations with more than 500 endpoints undergo the assessment for free and evaluate and identify critically exposed attack surfaces. That’s the first point of realization.

Only once that is done are you in a position to gain actionable knowledge of attacks that are currently alive and active in your ecosystem. Compared with the Cynet Threat Assessment program, any other assessment would be costly and time-consuming; this program needs hardly 72 hours to run a complete assessment, with zero out-of-pocket expense. A recent report, ‘Ninth Annual Cost of Cybercrime Study’, published in March 2019 by Accenture and the Ponemon Institute, says, “The cost of malware and malicious insider cyberattacks grew 12% in 2018 compared to the previous year. The former now cost U.S. companies an average of $2.6 million annually and the latter $1.6 million. The combined totals equate to one-third of the $13 million average cybersecurity costs to companies, which is $1.3 million more than in 2017.”

As a matter of fact, Cynet has changed the rules of the game altogether by offering a free threat assessment. The free assessment lets an organization benchmark its security posture against peers in its business stream. With a correct assessment, an organization can implement remedies quickly and accurately and mitigate risk to improve business outcomes. The key advantages of the Cynet Threat Assessment program are indications of live attacks, the user-identity attack surface, the hosts-and-apps attack surface, peer benchmarking, and ranking.

Cynet changes the whole paradigm of enterprise threat assessment

Eyal Gruner, CEO, and co-founder of Cynet says, “It is becoming increasingly common to find that organizations are already playing host to malicious activity at varying degrees of attack when we come to deploy our platform. Typically, organizations underestimate the attacker’s ability to silently operate, helping these criminal operations to be successful. With the Cynet Threat Assessment service, we are taking a proactive approach to discover the active risks and remove them from the environment.”

In a nutshell, any organization with more than 500 endpoints can register for free here: https://go.cynet.com/free-threat-assessment.


March 31, 2019  9:48 PM

Ludmila Morozova-Buss On Cyber Security and Digital Transformation

Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
Cyber security, Digital transformation

“A word ‘unprecedented’ seems too weak to convey just how much the dimensionless operational space of digital (r)evolution requires an instantaneous reaction.”

“A central topic of my essays is cybersecurity.
A fundamental and delicate question at the heart of my work is: how to motivate my readers to want to learn more.”

“People and organizations need to trust that their digital technologies are safe and secure; otherwise they won’t embrace the digital transformation.

Digitalization and cybersecurity must evolve hand in hand.”

“I advocate a Systems Thinking approach in educating our readers, followers, friends, business associates on digital transformation, emerging technologies and cybersecurity.

Systems thinking forever changed the way I think about the world and approach issues.

An open immeasurable non-linear system – the Cyber Space, where cyber threats and cybersecurity are two of many (to be defined) elements of this system.”

“The discipline of systems thinking is more than just a collection of tools and methods.

Systems thinking is a philosophy and a methodology for understanding behavior of complex dynamic systems.”

“Cybersecurity is becoming the most important security topic of the future – particularly in the age of digitalization.”

“Why is cybersecurity so hard?

o It’s not just a technical problem

o The rules of cyberspace are different from the physical world’s

o Cybersecurity law, policy, and practice are not yet fully developed

o There’s not enough manpower in the world to make sure networks are 100% secure 100% of the time, especially with the prevalence of a cloud-based infrastructure.”

Ludmila Morozova-Buss

“With hope to create and scale globally an inclusive ‘authors-publisher-readers’ circle of wisdom and expertise; with channelled determination to gain understanding by carefully selecting the best information sources (Dis Moi où Cherche! Mais où?) and reading between the lines, multiplied by expressed interest for knowledge sharing by the industry experts, and as part of my ‘Top Cyber News’ extended roundtable series; I brought in one-of-a-kind ‘Men on the Arena’: Charles (Chuck) Brooks, Stewart Skomra, Mike Quindazzi, and Scott Schober to create a series of articles ‘The Globality Quotient: Cybersecurity’ published by Dennis J. Pitocco, BizCatalyst360 – an award-winning digital magazine.

My Articles:

The Globality Quotient: Cybersecurity

Cybersecurity – Prevention And Protection.

Cybersecurity – What is Ethical Hacking or a Hacker is a Hacker.

Cybersecurity “Hacked Again” & Women in Digital Universe

Women in Cybersecurity: Why Closing the Gender Gap is Critical via TechNative”

