Quality Assurance and Project Management


January 31, 2019  11:29 PM

Zoho Creator Brings The Real Low Code Platform @Zoho

Jaideep Khanduja
Application development, Software development

When you hear or read "low-code", what comes to your mind first? Do you picture bending low to write code? Jokes apart, the technology has reached a level where you can build a low- to mid-complexity application with the Zoho Creator low-code platform. And for this you don't need to be a hardcore developer. You don't even need to be an expert in any development language. You need just two things to become proficient in Creator: one, some ground-level knowledge of its commands; two, training from an expert, maybe from Zoho or from someone who already knows the platform well. Another interesting part is that you can integrate this new piece with an existing application.

Low Code Platform

Source: Zoho.com

So, for instance, say SAP runs as the core business application in your organization, and a user group asks for new functionality. There are two ways of doing it. One, call in SAP experts, pay them a hefty per-day cost, and get it done. The other is to build the piece in the Zoho Creator low-code platform: pull the essential data from the existing SAP database, process it in Creator, and push the result back into SAP so it can flow through the downstream business processes. That means developing the functionality separately in Creator, with backward and forward integration to SAP, possibly without creating any new tables, or at most a couple of interim tables to stage the data before pushing it to SAP. That translates into a lot of relief in terms of money, time, and manpower.
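To make the idea concrete, here is a minimal sketch of such a round trip in Python. Everything in it is illustrative: the SAP OData URL, the Zoho Creator records endpoint, and the field names are hypothetical placeholders rather than the actual interfaces of either product, and a real integration would also handle authentication, paging, and error recovery.

import requests

# Illustrative only: both endpoints and all field names below are hypothetical
# placeholders, not the actual SAP or Zoho Creator URLs of any real deployment.
SAP_ODATA_URL = "https://sap.example.com/sap/opu/odata/sap/ZSALES_SRV/Orders"
ZOHO_RECORDS_URL = "https://creator.zoho.com/api/v2/<owner>/<app>/form/<form_link_name>"


def pull_open_orders(session: requests.Session) -> list[dict]:
    """Read open orders from the (hypothetical) SAP OData service."""
    resp = session.get(SAP_ODATA_URL, params={"$filter": "Status eq 'OPEN'"})
    resp.raise_for_status()
    return resp.json()["d"]["results"]  # OData v2-style payload shape, assumed


def push_to_creator(session: requests.Session, orders: list[dict], token: str) -> None:
    """Create one Zoho Creator record per SAP order (field names are assumptions)."""
    headers = {"Authorization": f"Zoho-oauthtoken {token}"}
    for order in orders:
        payload = {"data": {"Order_No": order["OrderID"], "Amount": order["NetValue"]}}
        session.post(ZOHO_RECORDS_URL, json=payload, headers=headers).raise_for_status()

The result pushed back into SAP would travel the same way in reverse, either directly through SAP's own interfaces or via the interim staging tables mentioned above.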

Low Code Platform Is A Reality Now

As a matter of fact, Zoho Creator can be learned by non-IT people too. All it takes is good business knowledge and a basic interest in picking up a few fundamentals of Creator. After that it is merely a matter of practice and application.

January 31, 2019  9:41 PM

ASUS Launches World’s Smallest Notebooks @ASUSIndia

Jaideep Khanduja
ASUS, notebooks

Now, this is what I call real, customer-focused innovation. Plenty of organizations make big claims in the name of innovation that turn out to be of little use. ASUS has launched the world's smallest notebooks in the 13″, 14″, and 15″ segments, named Zenbook 13, Zenbook 14, and Zenbook 15. They are ultrathin, with four-sided "NanoEdge" displays. The new NumberPad is not in its usual place: it is uniquely and innovatively built into the touchpad, so the numeric pad and the touchpad share the same surface and the laptop intelligently recognizes whether a touch is meant for one or the other. Effectively, the touchpad has become a multilayer component. That frees up room for better, larger keys and a significantly bigger keyboard, which enhances productivity and lets the user work with better pace and concentration.

smallest notebooks

Login on these Zenbooks (the world's smallest notebooks) is through a powerful 3D IR camera that recognizes the user's face even in a low-light environment. An ErgoLift hinge raises the rear of the keyboard when you open the laptop, which makes typing more comfortable and, by design, also improves cooling and audio performance. The machines are powered by 8th Gen Intel Core CPUs along with NVIDIA GeForce graphics cards, and they support gigabit Wi-Fi. These are just a few of the features; the real revolution is in the mindful, user-centric design. As a matter of fact, the Zenbook 13 is smaller than an A4 sheet of paper. That is phenomenal. The numeric keypad is LED-illuminated, which gives it a different kind of feel.

Smallest Notebooks are the latest Zenbooks from Asus

The lightweight Zenbooks, the world's smallest notebooks, are in reality powerhouses with unmatched quality and design.


January 31, 2019  9:12 PM

The Dark (Other) Side Of CIO / CTO Of An Enterprise IT

Jaideep Khanduja
CIO, CTO, Enterprise IT

The Chief Information Officer is the chief custodian of an organization's information. Mostly his role is to take care of digital information, but when it combines with that of a CISO (Chief Information Security Officer), physical information also comes under his purview. Similarly, a CISO as a separate role has to ensure the right measures are in place for the safety and security of every kind of organizational information. Scrutinizing information and information flows is straightforward enough: he can define appropriate processes and ensure strict adherence to them. But when the safety of information depends on employees or external stakeholders, is he responsible for scrutinizing those people too? I think yes. So, in that case, a background check of a new recruit also becomes essential.

Enterprise IT

Photo credit: Symic on Visualhunt / CC BY-SA

Recently an online technology magazine carried a news item about the CIO of a large retail organization and the launch of their mobile initiative. What was launched was a very basic mobile app, yet the way it was publicized did not match the initiative: a very small step was projected as something extraordinary. That is synthesized, induced news. I commented below the article that such a small thing should have been done two decades ago in such an old and large organization with such a large IT setup. It is a tragedy when technology heads make a mockery of technology. Organizations have no criteria to measure the intellectual and monetary loss from the non-automation of a business-critical process. Something that could have been done years back, if it stays uninitiated or "under process" for years, denotes lethargy.

Another example is a CIO removed from an organization for financial fraud. He was caught taking money from vendors on a few of the big deals happening in the organization. He was not formally sacked; he was told to put in his papers and move out immediately. Today he is the CIO of another large organization, quite possibly playing similar games. The sad thing is that organizations recruiting C-suite people sometimes never learn about these darker sides of their personalities.


January 31, 2019  6:58 PM

Five Pillars Of A True Hyperconvergence Software – II @MaxtaInc

Jaideep Khanduja
hyperconvergence, Maxta

This is the concluding post of our discussion with Barry Phillips, CMO, Maxta Inc. The first post was Software Model Versus Appliance Model – Which Business Model? In the second post he explained how an Inflexible Architecture Will Create Much Bigger Issues For IT. In the third post, the previous one, he elaborated the first two essential passing criteria for hyperconvergence software. Let us conclude the series with the remaining three important parameters.

  • 3. How easy or difficult is it for an organization’s IT to add capacity within the server?
  • Barry asks – “Can you add capacity within the server? The only way to add capacity with an appliance vendor is by adding another appliance. Even though some vendors offer a storage-only node, the step-up cost of another “pizza box” isn’t trivial. True hyperconvergence software enables you to add capacity to an existing server by adding drives to open slots, swapping in higher capacity drives, or by adding servers. If you can only add capacity by adding nodes, you have a fake software model.”

    True Hyperconvergence Software

  • 4. Licensing:
  • Barry says, “Are you being forced into the same appliance software licensing model or do you have a choice? Hyperconverged appliances tie the software license to the appliance, so when you refresh your hardware you get the privilege of repurchasing the software. This is a “term license,” which means you get to buy the software over and over again, and it’s the only option you have in a fake software model. While many software companies are starting to offer term licenses to provide subscription-like pricing, nearly all software companies still offer a perpetual license that you own forever. You should have a choice of perpetual or term licensing. Do you like the thought of owning the software for life, but don’t want to pay for it all upfront? Just lease the software from any number of leasing companies. It gives you the best of both worlds.”

  • 5. Memory and CPU Resources:
  • Barry concludes – “Can you add more memory and CPU resources? Just like adding storage capacity, you should be able to add additional memory or compute whether inside an existing server or by adding a compute-only server. A true hyperconvergence software model scales storage independent of compute. A fake hyperconvergence software model operates the same way as the appliance model.”


    January 31, 2019  6:34 PM

    Five Pillars Of A True Hyperconvergence Software – I @MaxtaInc

Jaideep Khanduja
    hyperconvergence, Maxta

This post is the third in the series, continuing the previous two posts in which Barry Phillips of Maxta Inc. talks about the five essential components of hyperconvergence software. You can read the first post by clicking here and the second post by clicking here. Any hyperconvergence software that doesn't fulfill the following criteria is not true hyperconvergence software. Who can tell it better than Maxta Inc.? The five important criteria are:

  • 1. Does existing server hardware support new software?
  • Whenever there is new software to be put in production, do you need to buy new hardware? Every time? As Barry Phillips, CMO, Maxta Inc. says, “Can the software be installed on your existing server hardware? This is the first sniff test of whether it is a true software model or a fake software model. Of course, you need to make sure the hardware has the right specifications to run the software, but you shouldn’t need to buy new server hardware. And don’t get fooled by the old trick of being able to run “trial” software on your own hardware, but you have to buy new hardware to put the software in production. True infrastructure software vendors like Microsoft, Citrix and VMware do not make you buy new hardware to run their software.”

  • 2. Is hyperconvergence implementation dependent on a certain set of server SKUs?
  • Barry questions, “Does your server hardware have to be from an approved list of server SKUs?” He then elaborates, saying, “If you do want to refresh your hardware when you implement hyperconvergence, does the hyperconvergence software vendor limit you to a certain set of server SKUs? If so, that isn’t really software; it’s just an appliance vendor separating out the appliance software from the confined set of appliance hardware.”

    True Hyperconvergence

The basic question is that there are a lot of vendors in the market offering different kinds of hyperconvergence solutions. Do they really provide a true hyperconvergence environment? Do they fulfill the above two criteria? Let us look at the other three criteria in the next post.


    January 31, 2019  6:11 PM

    Inflexible Architecture Will Create Much Bigger Issues For IT

Jaideep Khanduja
    Business model, hyperconvergence, Maxta

In continuation of my previous article on the software model taking drastic precedence over the appliance-based business model, let us try to encapsulate the five essential properties of true hyperconvergence software. Are most IT organizations still building an inflexible architecture? Barry Phillips, CMO, Maxta Inc. says, “Once that appliance-based product has taken off, the company will want to change to a software business model from a profitability perspective. This can be a difficult pivot to make financially since revenue decreases before profitability improves, and it changes how the sales teams are paid. If the pivot is made successfully, then the company is much more profitable and financially stable”.

    Barry adds further, “Even if a pivot to software works out for the vendor, it does not always work out well for the customer – especially if the software model is an appliance “in software clothing.” If you’re considering hyperconvergence software, make sure it’s not an appliance in disguise. Many vendors will claim to offer hyperconvergence software, but still significantly restrict how their solution can be deployed and used. Ask vendors these questions to determine how much (or how little) flexibility you’ll get with their software.” “As the hyperconvergence market shifts from appliance offerings to software, vendors that started out selling hardware platforms will need to shake both the appliance business model and the appliance mentality. As you evaluate hyperconvergence, always understand what limitations and costs will be in four or five years when you need to refresh or upgrade”, he continues.

    Talking further, Barry adds, “Infrastructure platforms are evolving quickly, so the ability to scale, choose and change hardware platforms, and use different hypervisors will certainly make life easier. Getting locked into an inflexible architecture will create much bigger issues for IT down the road. By asking the right questions upfront, you’ll be able to navigate the changing landscape.”

We will continue with Barry's ideation on hyperconvergence software in the next post. Continued »


    January 31, 2019  5:47 PM

    Software Model Versus Appliance Model – Which Business Model?

Jaideep Khanduja
    Business model, hyperconvergence, Maxta, software model processes

Who can understand hyperconvergence software better than Maxta Inc.? We already covered an article last month; you can read it by clicking here. Let us go a little deeper and try to understand the five basic requirements for hyperconvergence software. Before that, some basics. Are financial analysts clear about the concept of an organization switching from an appliance model to a software model? That switch is, in fact, what sends stock prices soaring, and the fundamental reason is the software model itself. When you weigh the pros and cons of the two business models, the software model obviously takes a large leap over the appliance model. But if software is the brighter business model, then why do companies keep sticking to shipping appliances? One really needs to understand the whole gamut behind it.

    Business Model

    Photo credit: vistavision on VisualHunt.com / CC BY-NC-ND

Obviously, selling appliances is much easier through any channel, whether directly or through a distributor and reseller network. On the same note, when it comes to software, it is equally easy to build an application for a specific hardware platform or a specific set of platforms, and supporting a limited number of platforms is not difficult. But that kind of design carries a lot of limitations that invite a large number of troubles; most importantly, such software can never achieve universal acceptance. That is the basic issue Maxta tries to overcome for organizations of any size launching or using any kind of software, and it is the most critical differentiator between the appliance-based business model and a software business model. Most organizations plan to change to a software business model for higher profitability.

    Appliance model is becoming an obsolete business model

The appliance-based model is becoming obsolete because hyperconvergence software has higher capabilities.

We shall continue this discussion in the next few articles. Continued »


    January 29, 2019  11:06 PM

    Violin Systems Excels In Extreme Performance Enterprise Storage

Jaideep Khanduja
    Enterprise storage, Storage, Violin Systems

Any kind of disruption creates two reactions. One: fear, out of which players become defensive and start stepping back. Two: a very few players see a new pool of opportunities and start exploring innovative ways to cater to it. Most players in the former category retreat into a shell and become history sooner or later. They keep sitting on the laurels of the past, because they declined to participate in the new game and hence have nothing to prove on the newer battlefield of business. Most players in the latter category succeed despite swimming upstream, thanks to two latent forces coming from within: courage, and innovative ideas taking the shape of reality. Violin Systems very distinctly stands apart as a spearhead in this category. Let's see what makes them a class apart in technology.

Violin Systems is synonymous with extreme-performance enterprise storage, delivered at the price of traditional primary storage. The sole aim is to empower enterprises to get maximum leverage from their business-critical data in a way no other technology player across the globe has attempted. The solution provides unmatched low latency and high IOPS, and it includes all the seriously essential data services such as data protection, data reduction, and business continuity, to name a few. Businesses can bank on Violin Systems to achieve a new level of application performance with extreme reliability, taking their business service levels to new heights while reducing costs drastically. Immediate access to information is an organization's top dream because it is the key to higher revenue and a substantial increase in customer satisfaction.

Violin Systems is synonymous with extreme performance enterprise storage

In today's scenario, which organization in the world would not like to be a data-driven business? Violin Systems helps enterprises drive their business-critical applications to support operations, quality, and delivery across their entire stakeholder ecosystem. It also helps enterprises scale easily and extend their competitiveness, staying ahead of the others in the fray. That is why enterprise customers rely on Violin Systems for unmatched extreme performance and drive their business without any compromise.


    January 13, 2019  10:07 PM

    Role Of Developers: Test As You Build Is The New Mantra

Jaideep Khanduja
    Software test design, Software testing, Testing

There is a constant shift in the role of developers globally. Actually, it is not a complete shift; it is an additional role embedding itself within their existing role of coding and development: test as you build. The new mantra is to test while you develop. Most of it would be manual testing of the small pieces of code being built, almost like unit testing or segment testing. This doesn't require any additional skillset in developers; they simply have to test what they are building. It is, rather, a small shift in mindset. A developer first has to convince himself that it is very much his job, because it is his own code that needs to match the business requirements 95 percent, if not 100 percent.
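To picture the granularity involved, here is a tiny, hypothetical illustration in Python: the developer writes a small piece of business logic and, in the same sitting, a couple of checks for it. The checks are shown in pytest style, though the point above holds even if they are run by hand.

# discount.py - a small piece of business logic written by the developer
def apply_discount(amount: float, customer_tier: str) -> float:
    """Return the payable amount after the tier discount (hypothetical rule)."""
    rates = {"gold": 0.10, "silver": 0.05}
    return round(amount * (1 - rates.get(customer_tier, 0.0)), 2)


# test_discount.py - written alongside the code, before moving to the next piece
def test_gold_customer_gets_ten_percent_off():
    assert apply_discount(200.0, "gold") == 180.0


def test_unknown_tier_pays_full_amount():
    assert apply_discount(99.99, "bronze") == 99.99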

It is a way of building assurance alongside the coding, as an additional part of the role of developers. And the shift is not happening only in the developer's role; testers are facing a similar overhaul of theirs. The scope of testing is changing in a big way from what it used to be. The ultimate goal of any testing is, after all, to increase quality and hence self-confidence and customer confidence. Testing and development are getting closer than ever before, and this is proving to be a smarter, more efficient move. While developers do more of the testing themselves, testers are expected to become automation engineers in the changing scenario. Testers doing only the traditional kind of manual testing is a big NO now for most progressive organizations.

    Role of Developers Takes A New Turn

Earlier, testers could get by with manual testing skills alone, with few technical skills, banking completely on their functional knowledge. That is no longer possible. As the role of developers changes, so does that of testers. Testing has become more demanding, and thereby more penetrating and effective. Much of the credit for this shift goes to DevOps and Agile. Testing needs to enter the mainstream of the project lifecycle as early and as frequently as possible, giving fruitful results faster. Of course, no testing is complete without the human touch of discovering the unknown through intuitive exploration. Good luck to all developers and testers in adapting to their changing roles, for the good of the quality of the products you are developing and testing.


    January 13, 2019  4:58 PM

    Test Automation Tools: Changing Trends In Software Testing In 2019

Jaideep Khanduja
    automation tool, Software testing, software testing automation tools, Software testing tools, Test Automation

Any organization in software development that still uses no test automation tools and banks completely on manual testing can't stay in the mainstream software business. Even an enterprise in a vertical other than software, with a focus on in-house development of key business applications, can't release a healthy product merely on the basis of manual testing. The reason is not that manual testing is incapable of testing a product fully. The reason is that in today's scenario, software applications don't run on a single platform, hardware, or operating system. For any application you develop, or for that matter procure from an external vendor, the first and foremost requirement is the capability to run on multiple platforms such as a laptop, tablet, smartphone, and desktop.

    Test Automation Tools

    Photo credit: Michael Kappel on Visual hunt / CC BY-NC

This automatically calls for a heterogeneous spectrum of operating systems and environments, which is obviously a big challenge if you have to perform all this testing manually. One way is to maintain a huge team of manual testers. Another is to use test automation tools and save a huge spend on resources, manpower, and time: these tools simulate various environments, loads, operating systems, and capacities. In my opinion, in the best case the manual-versus-automated testing ratio should be at least 30:70. Of course, it can't be 100% automation; certain things can still only be managed manually. A further option is to outsource testing to a reliable company, which lets you live with limited resources in the testing department.
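As a rough sketch of what the automation side looks like, the snippet below runs the same check across several platform and browser combinations with pytest. It is purely illustrative: the platform matrix and the stubbed page check are hypothetical, and a real suite would replace the stub with calls to Selenium, Appium, or a cloud device farm.

import pytest

# Hypothetical platform/browser matrix; extend it as the support list grows.
MATRIX = [
    ("windows-11", "chrome"),
    ("macos-14", "safari"),
    ("android-14", "chrome"),
    ("ios-17", "safari"),
]


def render_login_page(platform: str, browser: str) -> str:
    # Stub standing in for "open the app on this platform and read the page title".
    return "Login"


@pytest.mark.parametrize("platform,browser", MATRIX)
def test_login_page_renders_on_every_platform(platform, browser):
    assert render_login_page(platform, browser) == "Login"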

    Test Automation Tools Demand Has Increased Tremendously

But ensure you place a strong SLA with the outsourced testing organization in terms of information and outcomes. I doubt anybody at the customer end would accept software without proper testing reports. It is for this reason that demand for test automation tools has increased substantially in the global market, which clearly shows how testing trends are changing.


    January 13, 2019  3:32 PM

    Are You An Expert In Exploratory Testing? Check It Again!

Jaideep Khanduja
    Quality assurance, Testing

If you are in testing or are part of the software testing ecosystem, you are well aware of what it takes to become an exploratory tester. In the testing fraternity, not every tester is a good exploratory tester; it requires a certain set of skills. And the journey doesn't stop with acquiring those skills: they have to be continuously sharpened through learning, experimenting, and exploring. Exploratory testing, as a matter of fact, is not everybody's ballgame. Let us talk about the special skills that, if you acquire them, can make you a superb exploratory tester. Here we go:

Exploratory Testing

    Photo credit: NRCgov on Visual hunt / CC BY

    1. Critic:
You don't have to be a critic about everything that happens in life. But when it comes to testing a piece of code or a complete application, you should not take the code at its word. You have to look at it as a critic and find all the possibilities that could keep the code from running successfully. All kinds of permutations and combinations have to be taken care of. The business requirements and the code have to marry well, and the application flow has to gel with the business processes.

    2. Investigator:
At times things will not be as straightforward as they appear. The report might say everything has been developed as per the requirements, yet something may still be swept under the carpet. You may have to dig deeper and look at the rest of the iceberg for a reality check. After all, when a product or software is released, the whole organization's reputation is at stake.

    3. Go Getter:
Everything might not flow as smoothly as water in a river. As an expert in exploratory testing you have to go further than the normal routine. You have to stay calm in every adverse situation with one goal in mind: to find even the most minute bug in the software.

Exploratory Testing Needs A Set Of Unique Qualities

    4. Storyteller:
You have to be a good storyteller when explaining a bug in the software, so that the other person understands it to the core and the same mistake is not repeated next time.

    5. Communicator:
Besides being an expert storyteller, your communication skills have to be extraordinary. That can happen only when you are clear about everything you discover and how it differs from the expected behavior. If you can convince yourself of a thing, you can very well convince others of the same. And that is a must in exploratory testing.


    January 13, 2019  10:59 AM

    How Sacrosanct Is Requirements Analysis in SW Project Management

Jaideep Khanduja
    business requirements, Software test design, Software testing, Testing

If you are in project management, the first and foremost thing to check in your project management lifecycle is whether it has a place for requirements analysis. In my opinion, requirements analysis is entirely different from requirements gathering. It includes requirements testing and validation, along with parallel scrutiny against the actual business processes in place. If you skip all this and start coding, you are definitely inviting big trouble at a later stage. It may, in fact, lead to outright failure, with a big setback in reputation, finances, business, customers, and time. Any project failure also risks losing the best talent in the pool, especially if the project is not managed properly; obviously, the best people would not like to stay at a place where risks and mistakes carry such a high stake in the projects.

It is very important to understand the deep connection between requirements analysis and testing your requirements. Knowing that testing requirements is important is one thing; knowing how to do it in the best possible way to avoid later accidents is an altogether different ballgame. The requirements analysis stage has to get ample time and the best of resources to make it foolproof in a wholesome manner, because once coding starts the entire focus shifts to timelines, testing, and execution, and the analysis stage is over or takes a backseat. A thorough QA check of business requirements, processes, wireframes, and mockups is very important before coding begins. Anything in the requirements that is not testable is risky.
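A small, hypothetical example of what "testable" means in practice: the first wording below cannot be verified, while the second can be turned directly into an executable check during requirements analysis.

# Untestable requirement: "The search should feel fast."
# Testable requirement:   "95% of search queries must return within 2 seconds."

def is_requirement_met(response_times_ms: list[float]) -> bool:
    """Check the (hypothetical) acceptance criterion against measured timings."""
    within_budget = [t for t in response_times_ms if t <= 2000]
    return len(within_budget) / len(response_times_ms) >= 0.95


def test_search_latency_requirement():
    sample = [350, 420, 1800, 90, 2600, 300, 450, 500, 610, 700,
              800, 900, 1000, 1100, 1200, 1300, 1400, 1500, 1600, 1700]
    assert is_requirement_met(sample)  # 19 of 20 samples are within budget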

    Requirements Analysis Has To Be Heuristic In Nature

Business scenarios and test cases have to be complete and clear; anything vague is meaningless. There has to be a clear strategy. As a matter of fact, every piece of coding, and later the implementation, has to align well with one requirement or another. If it doesn't, something was wrong with the requirements analysis.


    December 31, 2018  10:33 PM

    2018 Roundup Quotable Quotes From My Various Posts of 2018

Jaideep Khanduja
    DH2i, Encryption, manageengine

    “Timely communication is very critical in business. It impacts business, in fact, in a huge way. If it is not timely, it loses its impact and effect.”

    – Jaideep

“Users always resist change. Similarly, management always fears investing in newer technologies.”

    – Jaideep

    “Now, AWS Marketplace customers can buy and deploy CloudPassage Halo Server Secure for high-performance cloud workload security in a highly predictable and cost-effective way – via a single integrated AWS bill. As global enterprises rapidly embrace the cloud for mission-critical workloads and DevOps for application development, automated security that can operate at speed and scale is becoming a critical path. AWS Marketplace helps eliminate protracted negotiations to make it easy for our customers to securely embrace the cloud.”

    John Janetos, Director, Business Development, CloudPassage.

    “Sometimes IT shops use instance stacking to help reduce the number of operating systems and licensed core counts since Microsoft allows up to 50 SQL Server instances per OS to be installed. The problem here, though, is the creation of a scenario where all of an enterprise’s eggs end up in a single basket, and one outage can thus impact many instances. If you get the stacking ratio wrong the first time, it’s also hard to move instances.”

    Connor Cox, Director of Business Development, DH2i (http://www.dh2i.com/).

    “With dynamics of digitalization fast changing and massive adoption of cloud technology, there is a greater need for automation in the endpoint management space, as endpoints are the major entry points of cyber attacks. ManageEngine’s new cloud-based patch management solution is engineered to meticulously look out for such threats on the move, thereby keeping both data and endpoints secured.”

    Rajesh Ganesan, director of product management, ManageEngine.

    “Readying macOS/iOS systems with the necessary authentication, encryption, management controls, and reporting are necessary to ensure a secure and compliant deployment. Therefore, providing the same level of protection afforded to PCs is an important consideration when integrating these devices into the business landscape.”

    Jason Dettbarn, CEO, Addigy.


    December 25, 2018  6:43 PM

    Addigy macOS-iOS Enterprise Readiness Charter @addigy

Jaideep Khanduja
    Apple, Apple iOS, macOS

Two significant fears I remember around Mac adoption, from my days working with various enterprises as CIO/CTO, were lack of support and lack of visibility, plus the exorbitant prices of Mac desktops at that time. Apple devices used to be a sign of luxury, and that fear ran across almost all enterprises and CIOs. That is not the case anymore, as I realized when I spoke to Jason Dettbarn, CEO of Addigy. Both fears have gone. For enterprise support, Addigy is there to manage a macOS/iOS ecosystem of any size or volume in an enterprise. And as far as pricing is concerned, Mac systems are well within reach and not too far above comparable Windows PCs and laptops. Given the advantages and reliability Mac systems bring, even a slight price premium is worth accepting.

    Addigy

    Source Addigy.com

Addigy recently released strategies to ensure macOS/iOS readiness for the enterprise, backed by a good amount of groundwork. The company highlights processes and procedures to strengthen macOS/iOS system management, security, and compliance. Addigy is a front-runner in cloud-based Apple device management software, so irrespective of geography it is fully capable of supporting any enterprise across the globe. The latest release includes a seven-point checklist to ensure macOS/iOS enterprise readiness. The last decade has seen substantial growth in the popularity of macOS/iOS devices thanks to significant gains in productivity, minimal help desk needs, lower management costs, and an unmatched overall user experience. Deploying macOS/iOS devices in the enterprise, however, can seem a difficult, cumbersome task to administrators, taking several steps to make the devices ready to meet the security and regulatory requirements of the computing environment.

    Addigy releases a strategy to strengthen macOS/iOS system management, security, and compliance

    Jason Dettbarn, CEO, Addigy says, “Readying macOS/iOS systems with the necessary authentication, encryption, management controls, and reporting are necessary to ensure a secure and compliant deployment. Therefore, providing the same level of protection afforded to PCs is an important consideration when integrating these devices into the business landscape.”


    December 20, 2018  9:44 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft V

Jaideep Khanduja
    Business Continuity, Data Replication, Hybrid cloud, VMware

This is the fifth and last post of our Q&A with Serge Shats, Ph.D., CTO and Co-Founder of JetStream Software. Links to the previous four posts are here:
    Post 1
    Post 2
    Post 3
    Post 4

    Q: How are IO filters used for cloud DR?

    A: Here are four scenarios in which IO filters can be used to replicate data for cloud DR:

  • 1. Business Continuity Cloud Services:
  • With data replication from the on-premises environment to a cloud service provider, the service provider can host a warm failover destination for the VMs running at the on-premises data center.

  • 2. Data Backup to Cloud Object Store:
  • With the same method of intercepting data on-premises, the data can be continuously replicated to a cloud object store for recovery. Data may be preprocessed for the destination through the specific object store’s APIs. Again, no snapshots are required.

  • 3. Point-in-Time Recovery for Continuous Data Protection:
  • By replicating data in a continuous stream instead of discrete snapshots, point-in-time navigation is possible for recovery of all data up to immediately prior to a critical event (e.g., malware intrusion).

  • 4. Cloud Data Protection Services for On-Premises HCI:
  • Rather than requiring a “like-to-like” model for cloud data protection, data replication from within the hypervisor itself can provide DR for Virtual SAN or third-party HCI, even if the cloud destination is running entirely different compute and storage hardware.

    JetStream

    Source JetStream Software



    Q: With respect to cloud DR, how do IO filters compare to other data capture methods?

    A: IO filters enable a continuous capture of data from within vSphere, which is a game-changer for cloud DR. Traditionally, organizations have looked to the cloud for snapshot-based backup, which has its place, but it is quite limited in terms of realizing true DR as a cloud service.

    It’s well understood that snapshots degrade application performance and by definition don’t support continuous replication. The tradeoff with snapshots is the shorter you want your RPO to be, the more snapshots you create, so the greater impact on runtime performance. Also, recovering a volume from many small snapshots will increase RTO. For true DR from a cloud service, continuous data replication from an IO filter gives a better, more efficient approach.
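A back-of-the-envelope illustration of that trade-off (the numbers are hypothetical, not from JetStream): with snapshot-based protection the RPO equals the snapshot interval, so tightening the RPO multiplies the number of snapshots to take, retain, and replay, whereas a continuous stream keeps the RPO at replication lag, measured in seconds.

def snapshots_per_day(rpo_minutes: float) -> float:
    """With snapshot-based protection, the RPO equals the snapshot interval."""
    return 24 * 60 / rpo_minutes


for rpo in (60, 30, 15, 5):
    print(f"RPO {rpo:>2} min -> {snapshots_per_day(rpo):>4.0f} snapshots per day to create, retain, and replay")

# Continuous replication from an IO filter streams each write as it happens,
# so the achievable RPO is bounded by replication lag (seconds), not a schedule.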

    Prior to the availability of IO filters, continuous data capture was possible, for example, by intercepting data in a vSCSI filter. This is how vSphere Replication accesses data as it makes snapshots for data recovery. The key problem with vSCSI is that it’s a private API intended for VMware’s use, and VMware provides no guarantee of support for third-party technologies that use vSCSI intercept.

    Another approach to continuous data capture is to install agents inside the VMs to replicate data in a stream. While this method can achieve RPOs of just seconds, it is an agent-based solution, which may raise concerns about security and compatibility.

    Lastly, virtual appliances typically run within their own VMs, so they are broadly compatible, and they generally don’t take snapshots, so they can stream data. The problem is that they either stand in the data path itself, introducing IO latency, or they require a filter or agent to intercept data.

    JetStream

    Source JetStream Software

    Q: What’s next for IO filters?

    A: While the IO filters API is primarily of interest to software developers providing data management services in the VMware ecosystem, interest has been growing recently, driven primarily by cloud and hybrid cloud use cases. In the future, it’s not difficult to see IO filters applied for uses beyond performance acceleration, live migration, and data protection to other types of policy-based data management.

    The idea of cloud services moving beyond disaster recovery and data protection solutions is feasible with on-premises IO filters enabling “X as a service” offerings, with the application of specific policies to data across an infrastructure comprising on-premises operations and cloud services.

    With an IO filter in each VM on premises, a solution can intercept and process every bit of data moving up and down the storage stack, and it can help the admin set data policies for those VMs, for any type of business requirement, such as cloud cost optimization or compliance. The key is that there is no need for an external data management framework — policy-based data management can be enabled within vSphere itself — across multiple data centers and cloud services.



    December 20, 2018  9:31 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft IV

Jaideep Khanduja
    Hybrid cloud, VMware

We are in the middle of our Q&A with Serge Shats, Ph.D., CTO and Co-Founder of JetStream Software. This is the fourth post in the series. The previous posts can be accessed from the links below:

    Post 1
    Post 2
    Post 3

    Q: How are IO filters used for virtual machine live migration?

    A: The problem with live migration is this: How do you keep applications running, with new data being written continuously, during the hours — or sometimes days — that it takes to move the applications’ data to the destination? There are a number of approaches, as virtual machine migration is not a new problem. But IO filters provide a capability that’s much simpler than anything we’ve seen before.

    With JetStream Migrate, the software deploys as a replication filter in the source VMware environment. The migrating VMs’ configurations and virtual disks are copied from the on-premises data center to the cloud data center, and while that copy and transfer process is taking place, newly written data from the VM is captured by the IO filter and also replicated to the destination.

    One of the advantages of this approach is that the copy of the virtual disk can be moved over the network connection, or it can be copied onto a physical device for “offline” transport to the cloud destination. So if you are familiar with the Amazon Snowball, it’s now possible for an organization to use a snowball-like device to transport data from one VMware environment to another VMware environment, without having to stop the VMs or their applications from running at the source.
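Conceptually, the pattern works like the toy model below (a deliberately simplified Python sketch, not JetStream Migrate's actual implementation): copy the bulk of the disk while a filter records which blocks the running VM writes during the copy, then replay just those dirty blocks at the destination.

# Toy model: in-memory "disks" stand in for virtual disks, and the dirty set
# stands in for what a replication IO filter records while the VM keeps running.
source = bytearray(b"A" * 64)          # the virtual disk being migrated
destination = bytearray(len(source))   # the disk at the cloud destination
dirty_blocks: set[int] = set()


def guest_write(offset: int, data: bytes) -> None:
    """A write issued by the still-running VM; the filter notes the block."""
    source[offset:offset + len(data)] = data
    dirty_blocks.add(offset // 16)     # 16-byte "blocks" in this toy model


# Phase 1: bulk copy while the VM keeps writing.
destination[0:32] = source[0:32]
guest_write(4, b"NEW!")                # lands in a region that was already copied
destination[32:64] = source[32:64]

# Phase 2: replay only the blocks dirtied during the copy.
for block in dirty_blocks:
    start = block * 16
    destination[start:start + 16] = source[start:start + 16]

assert destination == source           # destination converges without stopping the VM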

Hybrid Cloud Services for VMware

    Source: JetStream Software

    Q: With respect to disaster recovery (DR), why would someone use IO filters instead of snapshots?

    A: One of the key goals for using IO filters for data replication is that — unlike snapshots — data can be captured for replication without a detrimental impact on application performance. Also, because data is being captured in a stream, there are better options for delivering a variety of DR capabilities, such as an extremely low RPO and RTO, as well as very fast point-in-time recovery.

    We will be concluding this series in the next post. Continued »


    December 20, 2018  9:22 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft III

Jaideep Khanduja
    Hybrid cloud, VMware

This is the third post in this series of discussions with Serge Shats, Ph.D., CTO and Co-Founder of JetStream Software, in which he talks about Hybrid Cloud Services for VMware and what you need to know about IO filters. The first two posts can be reached here:
    Post 1
    Post 2

    Q: What are the advantages of IO filters?

    A: First, and perhaps most obviously, because IO filters run within vSphere, they truly achieve the goal of enabling “software-defined storage.” IO filters are designed to run with any type of datastore, including shared storage, VSAN/HCI or Virtual Volumes. Second, among the various software-oriented approaches to integrating third-party data services with vSphere, IO filters are the most “vSphere native.” IO filters don’t use any agents in the VMs, virtual appliances in the data path, third-party modules in the kernel, or calls to internal APIs. Solutions deployed as IO filters provide an assurance of support, compatibility, and stability that other approaches to software-defined storage can’t match. Of course, this becomes doubly important when we’re talking about cloud or hybrid cloud deployments, where abstraction is paramount.

    Q: How are IO filters used for storage IO acceleration?

    A: Storage IO acceleration was our first application of IO filters; it’s why VMware selected us to partner with them, as our IO accelerator served as a kind of reference architecture for the API as it was in development. JetStream Accelerate uses the caching filter to enable virtual machines to write data to a non-volatile memory device in the host servers so that when the time comes to read the data, if possible, the VM will read that data from the non-volatile memory cards or SSDs rather than having to traverse the SAN to get the data from the underlying storage.

    Reading data from host-based flash generally enables much faster application performance, and with sufficient memory and CPU, it allows increased virtual machine density as well. Host-based data access is especially important for latency sensitive applications, and data center operators also like the idea of deploying two to three times as many VMs on the same number of host servers without any performance penalty, just by reducing storage IO latency.

    Enterprises also benefit from greatly reduced storage overhead. For example, in an Oracle environment at a large telecom customer, we are seeing a 90 percent reduction in read operations against their storage arrays. That means that because they’re serving those operations from flash within the host systems, they don’t need to overprovision their storage for performance, saving a lot of money on their storage budget.
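The caching idea itself is simple enough to sketch. The toy model below is illustrative Python, not the JetStream Accelerate code: writes land on host-local flash as well as the array, and any read that finds its block in the host cache never has to traverse the SAN.

class HostReadCache:
    """Toy write-through read cache standing in for host-based flash."""

    def __init__(self, backing_store: dict[int, bytes]):
        self.backing = backing_store       # stands in for the SAN array
        self.flash: dict[int, bytes] = {}  # stands in for host NVMe/SSD
        self.hits = self.misses = 0

    def write(self, block: int, data: bytes) -> None:
        self.flash[block] = data           # keep a fresh copy on the host
        self.backing[block] = data         # and persist it to the array

    def read(self, block: int) -> bytes:
        if block in self.flash:
            self.hits += 1                 # served from host flash, no SAN round trip
            return self.flash[block]
        self.misses += 1
        data = self.backing[block]         # only misses reach the storage array
        self.flash[block] = data
        return data


array = {i: b"old" for i in range(10)}
cache = HostReadCache(array)
cache.write(3, b"new")
for _ in range(9):
    cache.read(3)                          # repeated reads never touch the array
print(cache.hits, cache.misses)            # 9 hits, 0 misses for block 3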

    Hybrid Cloud Services

    Source JetStream Software

    We will continue in the next post. Continued »


    December 20, 2018  8:51 PM

    Hybrid Cloud Services for VMware and IO Filters @JetStreamSoft

Jaideep Khanduja
    Cloud Services, Hybrid cloud, VMware

This post is in continuation of the previous one and is the second post in the series. We are in conversation with Serge Shats, Ph.D., about Hybrid Cloud Services for VMware and IO Filters. Shats is co-founder and CTO of JetStream Software. He has more than 25 years' experience in system software development, storage virtualization, and data protection. Previously co-founder and CTO of FlashSoft, acquired by SanDisk in 2012, Shats has served as a chief architect at Veritas, Virsto, and Quantum. He earned his Ph.D. in computer science at the Russian Academy of Science in Moscow, Russia. For more information, please visit www.jetstreamsoft.com, www.linkedin.com/company/jetstream-software-inc/ and @JetStreamSoft.

    Q: What are IO filters?

    A: The IO filters API is a feature of vSphere that allows third-party data services to be safely integrated into the data path between the virtual machine and its virtual disk(s), capturing data and events in order to provide some data management service. There are different IO filters for different data management functions, including:

  • Data replication for disaster recovery
  • IO acceleration with host-based non-volatile memory
  • Data encryption
  • Storage IO control
Hybrid Cloud

Source: JetStream Software



    Q: How do IO filters work?

    A: An IO filter is a software component that intercepts data and events continuously, with very low latency. But there is more to IO filters than what they do at the VM level. It’s helpful to think about IO filters at the cluster level as well. IO filters are deployed from a standard VIB and installed by vSphere to every host in a cluster, including new hosts that are added after the initial deployment. Even the process of updating or uninstalling filters is managed across the cluster by vSphere. Once deployed, the filters’ operating parameters are defined by VMware Storage Policy Based Management (SPBM). So it’s fair to say that the API enables a third-party data service to act as though it is “VMware native.”
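As a mental model (sketched below in Python purely for illustration; the real filters are built against VMware's VAIO framework, not this code), a filter is simply a function spliced into the write path between the VM and its virtual disk, so replication, caching, or encryption can each see every write without any agent inside the guest.

from typing import Callable

WriteFn = Callable[[int, bytes], None]


def make_replication_filter(next_write: WriteFn, replica_log: list) -> WriteFn:
    def write(offset: int, data: bytes) -> None:
        replica_log.append((offset, data))    # stream the write toward the DR site
        next_write(offset, data)              # then pass it down unchanged
    return write


def make_encryption_filter(next_write: WriteFn, key: int) -> WriteFn:
    def write(offset: int, data: bytes) -> None:
        next_write(offset, bytes(b ^ key for b in data))  # toy XOR "encryption"
    return write


virtual_disk: dict[int, bytes] = {}
def disk_write(offset: int, data: bytes) -> None:
    virtual_disk[offset] = data               # the bottom of the stack: the virtual disk

replica: list = []
vm_write = make_replication_filter(make_encryption_filter(disk_write, key=0x5A), replica)

vm_write(0, b"payroll")   # one guest write passes through both filters
print(replica)            # the replication filter saw the write before encryption
print(virtual_disk)       # the virtual disk received the transformed bytes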

    We continue our discussion in the following post. Continued »


    December 20, 2018  8:39 PM

    Hybrid Cloud Services for VMware: All About IO Filters @JetStreamSoft

Jaideep Khanduja
    Hybrid cloud, SanDisk, VMware, VMware vSphere

    Q&A with Serge Shats, Ph.D., CTO and Co-Founder, JetStream Software talking about IO filters in detail

    Over the past few years, we’ve seen a lot of new features introduced to the VMware platform. Many of these new developments were undertaken to make VMware an even better cloud and hybrid cloud platform. One of the less well-known developments may be one of the most important: IO filters. As organizations shift some or most of their infrastructure to cloud-based services or consume cloud services for data protection and business continuity, IO filters are becoming key to accomplishing some important capabilities, such as migrating VMs without interruption and protecting VMs in an on-premises data center from a cloud service. The VMware vSphere API for IO Filtering (VAIO) represents a significant step in how VMware can be used in cloud and hybrid cloud environments.

    We recently chatted with Dr. Serge Shats, CTO and co-founder at JetStream Software, about the company’s role in developing and applying IO filter technology for cross-cloud data management. Serge has led architecture and development at storage and data protection companies including Veritas, Quantum, and Virsto. He was CTO of FlashSoft Software, then engineering fellow at SanDisk after the company acquired FlashSoft.

    Q: Tell us about your role in developing the API framework for IO filters.

    A: Starting in 2014, while our engineering team was still at SanDisk, we began collaborating with VMware as the co-design partner for the IO filters API, so we have a rather extensive understanding of the API. It is a standard VMware API, and solutions that support the API are listed in the VMware Compatibility Guide and are certified “VMware Ready.” The VMware Ready certification ensures full support from VMware. JetStream Software has a large number of deployments of the software, mostly in large data centers running VMware for public and private cloud operations.

    We continue our interesting conversation with Serge Shats in the next post. Continued »


    December 20, 2018  6:57 PM

    Site24x7 Brings AI for Monitoring and Chatbot Integration @site24x7

Jaideep Khanduja
    ai, Artificial intelligence, Azure, Chatbot

Microsoft Azure monitoring becomes stronger and easier with the AI-driven technology developed by Site24x7; the same technology also powers a Microsoft Teams chatbot integration. Let us understand how it helps the client in a substantial way. The foremost benefit is a drastic decrease in application outage resolution time thanks to AI-powered insights. Interestingly, DevOps and application teams stay in their zone of work even while IT incidents are unfolding, courtesy of the Site24x7 chatbot for Teams. The good thing is that it is a cloud-based performance monitoring solution that now works more efficiently for DevOps and IT Operations. After this launch, IT teams can manage more than 100 Azure products using the Azure Insights API. All this happens in near real time, empowering IT teams to get timely alerts, act proactively, and bring resolution time down significantly.

    Site24x7

    Source: Site24x7.com

All this definitely gives the organization higher visibility into its hybrid clouds. After deploying the Site24x7 chatbot for Microsoft Teams, it becomes much easier for DevOps and application teams to get a real-time picture of the health of critical applications right in their workplace chat room. With the growing adoption of hybrid cloud environments, the monitoring load increases: teams now have to monitor not only the on-premises environment but also multiple cloud systems. The pace at which the global hybrid cloud market is growing is phenomenal; from around $50 billion in 2018, it is estimated to reach almost $100 billion by 2023, as per MarketsandMarkets, a B2B research company. For this growth to sustain or accelerate, a strong solution like Site24x7 needs to be in place.

    Site24x7 Enhances the DevOps experience

    Srinivasa Raghavan, Product Manager, Site24x7.com says,

    “With digital transformation picking pace, DevOps teams are happily embracing the public cloud for new workloads, but it comes with a few inevitable challenges such as getting end-to-end visibility, performance degradation and managing user experience of business-critical applications. With AI-driven monitoring and IT automation, the issues across private, public and hybrid environments help IT teams, to bring down mean time to repair incidents, thus improving productivity,”

    Bhrighu Sareen, General Manager, Microsoft Teams, says,

    “ChatOps scenarios built on Microsoft Teams empower users to collaborate, share critical documents, applications and communicate in real time. Site24x7’s AI-driven monitoring capability brings together developers, application teams and IT operations into a single efficient secure location in Microsoft Teams for quick problem identification all the way to resolution.”


    December 18, 2018  10:37 PM

    Enterprise Cloud Adoption – Go Elastifile Google Cloud Way @elastifile

Jaideep Khanduja
    Cloud Services, File storage, Google Cloud, Infrastructure management, SaaS

The early access program opened December 11th, and general availability begins in Q1 2019. Elastifile and Google Cloud have introduced a scalable, fully managed file service for Google Cloud. Yes, we are talking about enterprise cloud adoption the way the future needs it, designed by Google and Elastifile to tackle all current and future requirements of any enterprise. The solution aims to bridge traditional and cloud-native architectures, including Kubernetes, preemptible compute VMs, and Kubeflow. The key features include data persistence for Kubernetes, data resilience for preemptible cloud VMs, and mobilized machine learning. The solution becomes stronger and more intelligent as it grows over time post-deployment. The target business verticals include media and entertainment, manufacturing, life sciences, and traditional enterprise IT, to name a few, but the solution is not limited to these verticals; any other medium to large enterprise can adopt it and leverage its futuristic strengths.

    Enterprise Cloud Adoption

    Source Elastifile.com

Enterprise cloud adoption needs a highly scalable cloud file environment. The solution we are talking about is designed to serve a broad spectrum of enterprise applications, of any size and scale, that require file storage. It supports standard protocols like NFS and includes a full suite of enterprise features such as snapshots, multi-zone access, and so on. In my opinion the solution would be the right fit for any industry vertical, especially those where data grows at a phenomenal pace and needs best-in-class file storage. Its flexible service-class options give a business access to unlimited snapshots, priced astonishingly low at $0.03 per GB per month. That is how it aligns cost and performance with any business need. There are three different models to choose from, depending on an enterprise's basic requirements.

    Enterprise Cloud Adoption – A new Paradigm

The first option leverages Elastifile ClearTier technology, which integrates standard persistent disks and object storage within a POSIX-compliant, unified namespace. Typical performance: 2 GB/s bandwidth at 120 TB capacity. The cost? Just $0.10 per GB per month, with an effective cost of $0.08 per GB per month with 30% snapshots. This option is ideally suited to capacity-driven, cost-sensitive use cases. The second option uses ClearTier to integrate SSD persistent disks and object storage within a POSIX-compliant, unified namespace. Typical performance: 10 GB/s bandwidth at 120 TB capacity for active data. If the first option is for capacity optimization, this one is for general use: $0.17 per GB per month, with an effective cost of $0.13 per GB per month with 30% snapshots.

The third option aggregates SSD persistent disks into a POSIX-compliant, unified namespace. Typical performance: 15.6 GB/s bandwidth at 120 TB capacity, with high transactional performance scalable to millions of IOPS. That is just phenomenal, isn't it? It is for businesses that want performance optimized: $0.30 per GB per month, with an effective cost of $0.22 per GB per month with 30% snapshots, ideal for workloads requiring high performance. The Elastifile service is a cloud-native, truly software-defined architecture and is future-proofed to seamlessly leverage cloud technology advancements for years to come.
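The quoted effective prices work out if you assume (my reading, not a formula published by Elastifile) that 30% of the stored capacity sits in snapshots billed at $0.03 per GB per month and the remaining 70% is billed at the tier's base rate:

SNAPSHOT_PRICE = 0.03  # $/GB/month, as quoted above

def effective_price(base_price: float, snapshot_share: float = 0.30) -> float:
    """Blend of base-tier and snapshot pricing; the blend itself is an assumption."""
    return (1 - snapshot_share) * base_price + snapshot_share * SNAPSHOT_PRICE

for tier, base in (("capacity-optimized", 0.10), ("general purpose", 0.17), ("performance", 0.30)):
    print(f"{tier:>18}: ${effective_price(base):.2f} per GB per month")
# prints roughly $0.08, $0.13 and $0.22, matching the figures quoted above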


    December 17, 2018  11:25 PM

    Hyperconvergence Software by @MaxtaInc Creates A New Paradigm

Jaideep Khanduja
    hyperconvergence, Maxta

It obviously means something significant when a technical support services firm like Trusource Labs LLC banks on hyperconvergence software from Maxta Inc. Trusource Labs specializes in support for the IoT (Internet of Things), providing helpdesk services to organizations using Apple devices. Its customer range is quite versatile, spanning startups on one hand and Fortune 100 companies on the other. The biggest challenge for this Texas company, founded in 2013, was finding a scalable, flexible solution to support growth that was outpacing its legacy IT infrastructure. Within three years, Trusource Labs was adjudged the fastest-growing business in Central Texas; starting with just 20 employees in 2013, it now has more than 600 and has just started international operations from Limerick, Ireland. That is phenomenal growth, and Maxta had a substantial role in it.

    Hyperconvergence Software

    Source: Maxta.com

    Oklahoma Wesleyan University, with over 2,000 students and 500 staff, had a number of specific challenges. It was finding it difficult to manage a storage array that was running out of space, and an investment of $30,000 for additional capacity was almost out of the question. The university wanted to opt for hyperconverged appliances, but that too was a very expensive solution in terms of opex and capex. That is when it found a highly cost-effective hyperconverged infrastructure built on industry-standard servers, with the ability to scale storage and compute as and when the university required. The solution radically simplified storage management, with no more LUNs (Logical Unit Numbers), provisioning, or cumbersome capacity planning. The solution in this case, too, was Maxta hyperconvergence software.

    Hyperconvergence Software that reduces capital and operating costs by up to 70 percent

    Texas Southern University is much larger than Oklahoma Wesleyan University, with around 10,000 students and 1,500 staff, and it faced similar IT infrastructure challenges. The existing traditional storage arrays were difficult to manage and frequently led to misconfiguration, and regular IT staff turnover meant there was no common way of managing them. The university badly needed a simpler way to manage storage resources without hiring a storage administrator per storage array. After a good amount of market research, it found that Maxta’s hyperconvergence software could deliver a complete and cost-effective way to manage primary workloads utilizing TSU’s existing hardware assets. There was no need for specialized storage management resources, and capacity could be scaled simply by adding to an existing Maxta node rather than adding a complete node.

    TSU gained the ability to refresh server hardware without having to repurchase the hyperconverged software license. “With hyperconverged infrastructure, we can further utilize our hardware investments while bringing the data as close to the CPU as possible,” says Kelly Dean, Senior Systems Administrator, Texas Southern University. “That was one of the most important things, trying to not only simplify from a management perspective but also simplify in terms of the sheer number of pieces involved. In addition, I am looking at it from the perspective of, ‘What if I ever leave here? Can somebody come up behind me and understand how this works?’ You have to leave a place in a better state than when you got there,” concludes Dean.

    TSU can refresh server hardware without repurchasing the hyperconverged software license

    “SAN solutions start at $50,000 to $60,000 and if you run out of space, you have to upgrade all the drives or buy another one,” says Larry Chapman, IT Manager, Trusource Labs. “And you need a storage engineer to manage all the LUNs. You also have to have the personnel and a pretty big hardware investment. That’s not really scalable. We run a pretty tight ship in our engineering department. I don’t want to have onsite engineers at every location. We can remotely manage all this stuff. Because Maxta is so maintenance-free, I don’t have to double or triple or quadruple my staff. If you calculate that cost over years and years, I’m saving a ton of money,” he concludes.

    “A lot of the options were really expensive. I was initially looking at Nutanix and VxRail, trying to figure out how to afford to put that in my environment,” says Eric Goings, CTO, Oklahoma Wesleyan University. “Ultimately, Maxta is going to save us a lot more money than just the initial up-front cost,” he concludes.


    December 9, 2018  11:51 PM

    IoT and Hybrid Cloud in 2019 According to Don Boxley @DH2i

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    DH2i, Hybrid cloud, IIoT, Internet of Things, iot, VPN

    In a recent interaction about the key technology transformations in 2019, DH2i CEO and Co-Founder Don Boxley talked about two key developments he foresees regarding IoT and Hybrid Cloud. His 2019 predictions go as follows. The first prediction says enterprises will replace VPNs with micro-perimeters, which will become important in order to secure IoT gateway communications. This clearly means VPNs will vanish, and so will the threats and vulnerabilities associated with them. Security is obviously the prime priority for enterprises. The dependence on data, technology, and the internet is at its peak, and it comes with a bundle of threats. But exploring and using technology is inevitable; it is a basic necessity now. The new product differentiator for enterprises is making smart products and IoT devices. Most devices now come with IP addresses, and organizations are investing in IoT initiatives.

    IoT and Hybrid Cloud

    Organizations understand very well that the IoT gateway layer is the primary key to earning a high dividend on those IoT investments. As we all know, IoT gateways involve device connectivity, protocol translation, updating, upkeep, management, and predictive and streaming data analytics. They also carry a greater volume of data flowing between devices and the cloud. Opening so many gates definitely increases the risks, and that calls for a big improvement in the security of that high-volume data flow. Nothing short of a Zero Trust security model will work in such cases. Enterprises will have to replace VPNs with micro-perimeters if they have to secure the IoT and Hybrid Cloud spectrum. Micro-perimeters, understandably, remove an IoT device’s network presence, thereby eliminating the kind of attack surface a VPN exposes. Zero Trust hybrid cloud security will become most critical.

    IoT and Hybrid Cloud Seek New Paradigms

    Many organizations are drafting or following a hybrid strategy to manage IoT and Hybrid Cloud, a strategy that involves deep integration between on-premises systems and off-premises cloud/hosted resources. VPN software solutions are becoming obsolete in the wake of the new IT world of hybrid and multi-cloud environments, because VPNs were never designed with these newer transformations in mind. If you try to align VPNs with these newer environments to create a secured world, it becomes too complex to achieve. Moreover, a VPN gives each user a slice of the network, which easily creates a lateral network attack surface with a higher amount of risk. The need is very different now. Enterprises require a new class of purpose-built security software if they are to do away with these risks.

    This new security software empowers enterprises to build lightweight dynamic micro-perimeters to secure application- and workload-centric connections between on-premises and cloud/hosted environments. This, in turn, ensures virtually no attack surface. As Don says,

    “In 2019, every hybrid cloud security strategy should be updated to replace VPNs with micro-perimeters.”

    “In 2019, every VPN used for a PCI application should/will be replaced with a micro-perimeter.”

    “In 2019, if a company’s hybrid cloud network security strategy relies on VPNs, the CEO should fire their head of network security.”


    December 9, 2018  9:20 PM

    Data Protection Automation Through An Innovative Breakthrough Solution

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Amazon EC2, AWS, AWS EC2, Data backup, Data protection, EC2, Hyper-V, Nakivo, VMware

    Definitely, the new oil is data. And looking at its exponential increase in volume and importance, it needs the utmost attention towards its protection, safety, and availability. Any organization that brings out an innovative breakthrough in any of these three segments would immediately meet tremendous attention and response. NAKIVO brings a breakthrough solution for data protection automation, covering backup and replication. In fact, just a week ago it took automation of core data protection tasks to the next level. The solution includes significant and unique functionality that empowers NAKIVO customers to put their VM data protection on auto-pilot. This, in turn, simplifies data protection management, providing the risk-free environment that any enterprise would need. NAKIVO Inc., a software company, is growing quite fast in the virtualization and cloud backup spectrum by bringing unique and valuable solutions to the enterprise world.

    Data Protection Automation

    Source: https://www.nakivo.com/customers/success-stories/#

    The latest data protection automation solution is part of NAKIVO’s Backup & Replication v8.1, released on the 3rd. This is a revolution in helping businesses manage their data protection chores through automation, reducing manual intervention to a large extent. In addition, the release includes universal recovery of any application objects. The two key points to note about this solution are (a) Policy-Based Data Protection and (b) Universal Object Recovery. As we all know, managing a large VMware, AWS, or Hyper-V infrastructure is a difficult task. NAKIVO Backup & Replication v8.1 therefore focuses on Policy-Based Data Protection. This means customers can now create backup, replication, and backup copy policies to fully automate data protection processes with easy configuration and a good amount of flexibility. The policy parameters can be VM name, location, size, tag, power state, or any combination of these.

    Data Protection Automation Meets An Ultimate Solution

    Once the policies are set up, the system scans the whole infrastructure for VMs that match the defined criteria and ensures their complete protection in an automated manner. This means all critical VMs and EC2 instances can now be given complete protection with almost zero manual input. That is, actually, a remarkable achievement. Similarly, Universal Object Recovery lets customers recover any application objects back to the source, a pre-defined location, or a physical server. This, in turn, saves a lot of the valuable time and resources that an enterprise spends on restoration. A step further, customers can recover individual items from any application or file system by mounting VM disks from backups to a target recovery location. The best part is that it doesn’t require restoring the entire VM first.
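
    As a rough illustration of how policy-based selection works in general (this is my own sketch, not NAKIVO’s actual API or policy syntax), a policy can be thought of as a rule that is re-evaluated against the live inventory, so newly created VMs that match it are picked up for protection automatically:

        # Illustrative only -- not NAKIVO's actual API or policy syntax. A policy is a
        # rule re-evaluated against the live inventory, so matching VMs are protected
        # automatically without anyone editing a backup job.
        import fnmatch
        from dataclasses import dataclass

        @dataclass
        class VM:
            name: str
            tag: str
            size_gb: int
            powered_on: bool

        def matches_policy(vm, name_glob="*", required_tag=None,
                           min_size_gb=0, must_be_powered_on=False):
            return (fnmatch.fnmatch(vm.name, name_glob)
                    and (required_tag is None or vm.tag == required_tag)
                    and vm.size_gb >= min_size_gb
                    and (not must_be_powered_on or vm.powered_on))

        inventory = [VM("sql-prod-01", "production", 500, True),
                     VM("dev-sandbox", "dev", 40, False)]
        to_protect = [vm for vm in inventory
                      if matches_policy(vm, name_glob="*prod*", required_tag="production")]
        print([vm.name for vm in to_protect])   # ['sql-prod-01']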

    Bruce Talley, CEO of NAKIVO Inc., says, “We are expanding our product’s functionality to further improve reliability, flexibility, and ease-of-use. Policy-Based Data Protection in v8.1 is yet another significant step in this direction. By fully automating core data protection tasks, NAKIVO Backup & Replication minimizes the possibility of human error and helps customers gain more confidence in their data protection strategies.” NAKIVO is the winner of “Best of VMworld 2018” and the Gold Award for Data Protection. That itself speaks to its consistent growth and the path-breaking VM backup and site recovery solutions it brings to the table. And now, an ultimate solution in data protection automation makes it a consistent pioneer in this field.

    Data Protection Automation is the need of the hour

    Honda, China Airlines, and Coca-Cola are just a few names on its customer list of a large number of enterprises worldwide. Name a global-standard storage system and it is fully supported by the NAKIVO solution. So whether you are using a storage system like Synology, Western Digital, QNAP, ASUSTOR, or NETGEAR, and/or a high-end deduplication appliance like Dell EMC Data Domain or NEC HYDRAstor, you can be assured of a 2X performance and protection advantage with the NAKIVO solution. Obviously, with the new data protection automation, this performance and protection advantage increases manifold, providing enterprises a risk-free environment. The trial download is available here. To read some of the great success stories, click here. A datasheet for reference is available here.


    November 19, 2018  12:03 AM

    Quobyte Storage Platform For Data Centers @Quobyte

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Center, Data Center Storage

    If you want to measure the depth of technology, reliability, trust, and relationships in a solution-providing organization, look at the size and scale of its customers. Quobyte is the best example in this regard, having wide acceptance on a global front for its state-of-the-art storage solution for data centers of any size across the globe. Earlier this year, the UK Science and Technology Facilities Council deployed Quobyte to manage the JASMIN super-data-cluster, which holds more than 40 petabytes of storage. The reasons are its massive scalability, operational ease, and unmatched performance. JASMIN is accessed globally by thousands of users to search, manipulate, and analyze data, and around 1-3 PB of data is processed on a daily basis. Quobyte’s Data Center File System lets it unify its file, block, and object storage datasets in a centralized system of more than 11,500 cores on around 600 nodes.
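
    To put that daily volume in perspective, here is a quick back-of-the-envelope conversion of 1-3 PB per day into average sustained throughput, assuming decimal units and a perfectly even load over 24 hours (real traffic is of course bursty):

        # Rough average throughput implied by 1-3 PB processed per day.
        SECONDS_PER_DAY = 86_400
        for pb_per_day in (1, 3):
            gb_per_second = pb_per_day * 1_000_000 / SECONDS_PER_DAY   # 1 PB = 1,000,000 GB
            print(f"{pb_per_day} PB/day is about {gb_per_second:.0f} GB/s sustained")
        # Roughly 12-35 GB/s of aggregate bandwidth, spread over ~600 nodes.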

    Quobyte

    Source – Quobyte.com

    Earlier this year, NEC Deutschland GmbH, a leading HPC solutions provider, partnered with Quobyte to develop and deploy a complete storage solution stack for HPC workloads, based on NEC HPC hardware and Quobyte’s Data Center File System software. This solution gives enterprises and researchers a cost-effective, high-performance storage system whose performance and capacity modules are easy to configure, ensuring excellent operational efficiency from hyperscale to HPC workloads. No other solution matches this massively scalable modern file system for managing any kind of storage architecture. It is highly configurable and capable of easily scaling to hundreds of petabytes without putting an extra load on administrative costs, while managing large streaming workloads and handling millions of small-file workloads. That is the beauty of this remarkable system.

    Quobyte can manage hundreds of petabytes of data easily

    Earlier this month, Actapio Inc. selected Quobyte to provide the storage platform for its data centers. Actapio Inc. is a U.S. subsidiary of Yahoo Japan Corporation, commonly known as Yahoo! JAPAN. The internet giant was seeking the best possible data center file system and storage platform for a massively scalable and fault-tolerant storage infrastructure, and ultimately zeroed in on Quobyte. On November 14, at the OpenStack Summit in Berlin, Yahoo! JAPAN presented its deployment of OpenStack with Quobyte.


    November 15, 2018  11:32 PM

    DirectSearch Tackles Enterprise File Search Issue @Cloudtenna – 2

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    File management

    This is the concluding post about Cloudtenna and its unique, innovative tool DirectSearch. You can read the previous post here. The key features of DirectSearch by Cloudtenna include AI-powered accuracy, security, cross-silo file search, granular criteria, and the super speed with which it finds relevant files. The product was given to 30 beta customers, including online businesses, universities, and medium to large enterprises, which are currently using DirectSearch across six services: Dropbox Business, Google Drive, ShareFile, Box Business, Jira, and Confluence. The production version will additionally cover Microsoft OneDrive, Outlook mail (including Office 365), Gmail, and all network drives/NAS (Network Attached Storage). Bryan Pham is the founder of Cloudtenna.

    DirectSearch

    Source Cloudtenna.com

    While Pham is an expert in cloud storage infrastructure, Cloudtenna co-founder Aaron Ganek is an expert in user experience. Seed funding of $4 million from Blazar Ventures and a strategic investment from Citrix are more than enough to show how promising the solution is for the international market. Andy Cohen, VP Corporate Development, Citrix, says, “Citrix is pleased to be part of the Cloudtenna investment round. Cloudtenna has outstanding new technology in intelligent search and data analytics and we are excited to be part of this round with Cloudtenna.” The talent in the organization includes top professionals from Silicon Valley companies like Rhapsody Networks, Symantec, NetApp, Oxygen Cloud, Sun Microsystems, Fusion-io, VERITAS, and EMC. The engineering team includes key contributors to the NetApp WAFL and VxFS code bases. Future DirectSearch developments include file management, auditing, analytics, and e-governance. The tool is already serving many enterprises.

    DirectSearch Works On Outstanding New Technology

    Pham who also serves as Cloudtenna CTO says, “Today’s workers are using literally dozens of file repositories, which has become a critical problem not only for individual productivity but for corporate IT departments and for vendors of platforms that would benefit from improved search functions. There are incomplete point solutions that partially solve a piece of the puzzle, but DirectSearch will revolutionize the way people find and work with files day in and day out.”

    “Cloudtenna’s technology is viable today but also has broad implications for file and data management in a modern enterprise that combines local, cloud, BYOD, and SaaS assets,” says Robert Poulin, founding partner, Blazar Ventures. “We look forward to advising Cloudtenna on its enterprise, OEM, and direct market strategies so it realizes the full benefit of what it has to offer.”


    November 15, 2018  11:24 PM

    Cloudtenna Tackles Enterprise File Search Issue @Cloudtenna – 1

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    File management

    DirectSearch is a uniquely innovative product that aims to solve a big universal problem faced by almost all enterprises. I remember how difficult it used to be to ascertain the latest version of a file that existed in different versions in different locations, whether on a cloud, a centralized file server, or local machines. So many working hours would go to waste searching for an important file requiring immediate action, and this is still quite common in most enterprises. Cloudtenna, a California-based software startup, has launched DirectSearch to overcome this problem in a phenomenal manner. The tool searches files across multiple platforms at tremendous speed, covering clouds, local servers, remote servers, and local client machines, and the searches take a stunning 400-600 milliseconds. The growing issue of file sprawl in enterprise file search is also a consequence of enterprise data increasing at a fast pace.

    Cloudtenna

    Source Cloudtenna.com

    The new technology that Cloudtenna brings is already making waves in the news. Premium news and research companies like ZDNet, Forbes, The Register, Forrester, and many more are talking about this powerful and probably most efficient tool. It is a common scenario for enterprises to have files scattered across on-premises repositories, cloud file storage services, and hosted web servers. DirectSearch leverages machine learning and artificial intelligence, along with natural language processing (NLP) and rigorous automation, to create this innovative, all-new search engine. The engine finds the required files scattered across network drives, cloud storage, email apps, and various hosted collaboration suites. More storage locations obviously mean more chaos and inefficiency. As per a report by IDC, a person spends more than 2.5 hours per day searching for files, which comes out to around 30% of the working day.
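
    Conceptually, a cross-silo search of this kind fans a query out to every connected repository and merges the results into one ranked list. The sketch below is a generic illustration of that idea only, with made-up repository names and a toy relevance score; it is not Cloudtenna’s implementation:

        # Generic illustration of cross-silo search, not Cloudtenna's implementation:
        # fan the query out to each connected repository in parallel, score the hits,
        # and merge them into one ranked list.
        from concurrent.futures import ThreadPoolExecutor

        def search_repo(repo_name, file_names, query):
            terms = query.lower().split()
            # Toy relevance score: how many query terms appear in the file name.
            return [(sum(t in name.lower() for t in terms), repo_name, name)
                    for name in file_names
                    if any(t in name.lower() for t in terms)]

        repos = {
            "google_drive": ["Q3-budget-final.xlsx", "meeting-notes.txt"],
            "dropbox":      ["Q3-budget-draft.xlsx"],
            "jira":         ["budget-approval-ticket.pdf"],
        }

        def cross_silo_search(query):
            with ThreadPoolExecutor() as pool:
                futures = [pool.submit(search_repo, name, files, query)
                           for name, files in repos.items()]
                hits = [hit for future in futures for hit in future.result()]
            return sorted(hits, reverse=True)    # best-scoring files first, across all silos

        print(cross_silo_search("Q3 budget"))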

    Cloudtenna Leverages New Age Technologies To Build DirectSearch

    The IDC report indicates that an enterprise with 1,000 knowledge workers would waste around $50,000 per week, accumulating to $2.5 million a year, just on locating and retrieving information. If an enterprise could save that stupendous cost, it would translate into phenomenal productivity and revenue in return.
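
    The quoted figures hang together roughly as follows; the per-hour rate used here is simply what the numbers above imply (real loaded costs per knowledge worker would typically be much higher, so the actual waste is likely larger):

        # Rough consistency check of the quoted IDC-style figures (illustrative only).
        knowledge_workers = 1000
        hours_lost_per_day = 2.5
        implied_cost_per_hour = 4          # the rate implied by the quoted $50,000/week
        working_days_per_week = 5
        weekly_cost = (knowledge_workers * hours_lost_per_day
                       * implied_cost_per_hour * working_days_per_week)
        print(f"${weekly_cost:,.0f} per week")        # $50,000 per week
        print(f"${weekly_cost * 50:,.0f} per year")   # ~$2.5 million over ~50 working weeks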

    The article concludes in the next post. Continued »


    November 13, 2018  11:42 PM

    The world’s first facilities optimization software

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Facility management, Internet of Things, Machine learning

    This is the concluding post of an insightful interaction with Prabhu Ramachandran, CEO and Co-Founder, Facilio. To know more about Facilio, read the previous three posts in the series. Let us understand the power of the world’s first facilities optimization software.

    The links to previous posts in this series are here:

    Post 1
    Post 2
    Post 3

    4. What will you say are the key features of the app?

    Facilio is the world’s first facilities optimization software that harnesses IoT and AI to leverage existing automation data and provide superior facilities experience. We’re built from the ground up keeping in mind property owners and facility managers, to help them achieve real-time operational efficiency, sustainability, and smoother tenant experiences across building portfolios.

    This is achieved through two suite offerings – Operations and Maintenance Suite and a Building Performance Suite.

    The operations and maintenance suite comprises work order management, asset and space management, preventive and predictive maintenance, as well as safety and fire alarm management. One of the critical concerns for building owners is the effective utilization of their task force. Facilio helps by automating work orders and even creating contextual tickets. By integrating with existing building management systems, it is able to assign tasks to teams with a detailed description of the problem-causing components, all in real time. Its mobile-driven features mean that your task force can receive assigned tickets on the go, saving time and increasing effectiveness.

    Facilio also monitors the health of assets and provides statistics for condition-based predictive maintenance. Its advanced fault-warning systems mean less downtime and quicker recovery. Similarly, its fire and safety management suite provides monitoring across multiple sites 24/7, with inbuilt false-alarm differentiation, notification capability across stakeholders, and automatic work order creation in case of system faults.

    Facilio’s building performance suite provides energy analytics, water monitoring and management, and HVAC monitoring. The energy analytics are normalized for weather, time, and occupancy levels and provide consumption heatmaps and regression analysis. With portfolio-wide comparisons, facility owners can finally take stock of sustainability performance across the portfolio.

    HVAC supervision allows predictive detection and diagnosis of equipment faults and performance inefficiencies. Apart from trend and condition monitoring, the software also delves into the root causes of inefficiencies in HVAC systems and integrates an automated workflow for immediate corrective action, enabling more solid resolution by maintenance teams backed by AI-driven contextual insights.

    While all suites can be used independently, the combination provides much-needed value through efficiency, improved productivity and intelligent decision-making ability.
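
    To give a rough idea of what the condition-based detection Prabhu describes looks like in its simplest form, here is a minimal sketch of my own (not Facilio’s algorithm): flag a sensor reading that drifts well outside its recent baseline and hand it off to an automated work order flow.

        # Minimal condition-monitoring sketch (my illustration, not Facilio's algorithm):
        # flag a reading that drifts well outside its recent baseline so it can feed an
        # automated work order.
        from statistics import mean, stdev

        def detect_faults(readings, window=12, threshold=3.0):
            alerts = []
            for i in range(window, len(readings)):
                baseline = readings[i - window:i]
                mu, sigma = mean(baseline), stdev(baseline)
                if sigma and abs(readings[i] - mu) > threshold * sigma:
                    alerts.append((i, readings[i]))    # candidate for a contextual ticket
            return alerts

        supply_air_temp_c = [12.9, 13.1, 13.0, 12.8, 13.2, 13.0, 12.9, 13.1,
                             13.0, 12.9, 13.1, 13.0, 17.4]   # last reading drifts sharply
        print(detect_faults(supply_air_temp_c))              # [(12, 17.4)]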

    facilities optimization

    Source – Facilio.com

    Strengths of facilities optimization software

    5. Can you give a case study on how Facilio offers analytics which helps buildings at the time of crisis?

    A detailed case study is available at https://facilio.com/resources/case-study/green-optima.html.


    November 13, 2018  10:08 PM

    Real-Time Facilities Optimization IS The Ultimate Key

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ai, Artificial intelligence, Facility management, IIoT, Internet of Things, iot, Machine learning, ml

    We are interacting with the Founder and CEO of Facilio, Prabhu Ramachandran. This is the third post in the series. The first post gives an overview of facility management in India. In the previous post, Prabhu told us what changes Facilio’s solutions bring to day-to-day operations in a facility, considering facility management still continues to be a traditional industry in India. To catch up on the earlier posts, you can access the first post here and the second post here. Organizations must now think about real-time facilities optimization and a unified software solution to stay ahead.

    facilities optimization

    Source: Facilio.com

    Facilities optimization Is The Key To Success

    3. Give a brief on how Facilio has helped bring down operational costs and increase manpower efficiency for buildings.

    Facilio’s goal is to empower CREs and FMs (Corporate Real Estate and Facility Management) to be efficient value-partners and help them deliver exceptional services and experience. By providing a centralized IoT and AI-driven solution, Facilio puts them in the driver’s seat to proactively control operations rather than firefighting issues reactively.

    Facilio’s unified software solution rationalizes the asset-intensive buildings ecosystems and reduces functional complexity. It seamlessly integrates with existing building systems and predictively improves efficiency, through centralized and real-time facilities optimization.

    The global annual spend on facilities services stands at close to $1 trillion today, with the spend on building energy management almost matching that number. Any value that can be added to economies of such scale is enormous in potential. The advantage of adopting technology to unify operations and centrally manage portfolio performance in real time is multi-fold, and it helps concurrently optimize the lifespan of assets and improve sustainability savings. Operating costs have always been a primary outlay for CREs, and this software-led approach directly benefits the net operating income for property owners while significantly enhancing the amenity value of the property.

    At Facilio we are extremely upbeat about the market because of our familiarity with the projected advances in technology, as well as the response we have received so far. As a rule, even before savings are realized by our clients, the efficiencies generated by our solution result in their highly enthusiastic approval.

    We will have the concluding discussion with Prabhu in the next post. Continued »


    November 13, 2018  9:53 PM

    Organizations Need Smart Facility Management Solutions

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    ai, Artificial intelligence, Facilities management, IIoT, Internet of Things, iot, IT systems, Machine learning, SaaS

    Facilio has offices in Atlanta, the United States, and Chennai, India. We are interacting with Prabhu Ramachandran, CEO and Co-Founder, Facilio. This is the second post in the series; you can read the first post here. In the previous post, Prabhu presented an insight into India’s position in facility management. To survive in this competitive world, organizations need a smarter way of managing facilities.

    Facility Management

    Source: Facilio.com

    Facility management needs a smarter way

    2. What changes do Facilio’s solutions bring in the day-to-day operations in a facility considering facility management still continues to be a traditional industry in India?

    As an industry by itself, facilities management largely involves people, processes, and machines working together. This mandates huge operational spend and dealing with complex proprietary automation systems. With the emergence of IoT and AI, the global facilities industry holds significant disruption potential, with the adoption of smart building technology fast reaping benefits.

    The challenge, however, is to establish this evolution without replacing processes in the current ecosystem itself. Facilio has been designed in such a way that the old system need not be replaced, nor are additional installations of any kind required. Its powerful IoT technology focuses on real-time data acquisition that drives AI-based insights for efficient operations, better asset health, sustainability, and a smoother experience across building portfolios. The advantage of adopting technology to centrally manage and operate portfolios in real time is multi-fold, and Facilio helps concurrently optimize the lifespan of assets and improve sustainability savings, in addition to creating a smarter workforce. This approach directly benefits the net operating income for property owners while significantly enhancing the amenity value of the property.

    A recent analysis report by Verdantix, a leading global research consultancy specializing in energy, real estate, facilities, and maintenance, puts the global market for software and related IT services in buildings at $8 billion currently, expected to reach $12 billion by 2022.

    We continue interacting with Prabhu in our next post. Continued »


    November 13, 2018  9:38 PM

    An interaction with CEO and Co-Founder, Facilio

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, Facility management, Internet of Things, iot, IT systems, Machine learning, SaaS

    We are interacting with Prabhu Ramachandran, CEO and Co-Founder, Facilio. Facilio is the world’s first IoT (Internet of Things) and AI (Artificial Intelligence) driven SaaS (Software as a Service) facilities management solution. In fact, it is a first-of-its-kind AI-driven facilities management SaaS start-up that is making waves in this particular segment. Prabhu is a Zoho veteran with over 17 years of experience in IoT who headed WebNMS, the IoT division of Zoho. Facilio’s clients include SPI Cinemas India, Mazaya Business Avenue Dubai, and Universal Control Innovations USA, which shows the power of their unique solution and its universal demand across the globe.

    Facilio

    Source: Facilio.com

    1. In comparison to other countries, can you give an overview of where India is currently positioned in facility management?

    Indian real estate, in terms of commercial facilities, is pretty much reflective of global trends. With the growing number of coworking spaces and the need for workplace tools, India is significantly contributing towards the RE revolution.

    With facilities management – the industry is at the tipping point of disruption and we already see early adopters enjoying tech-enabled opportunities as a competitive edge to enhance their facility offerings and provide flexible benefits that improve both the bottom-line and the overall tenant experience. The next five years will most definitely see the effective adoption of IoT, AI, and Machine Learning by enterprises to manage facilities, just as customarily as their IT systems.

    The current approach to facility management is clearly siloed, steered by hardware automation over the years. Facilio wants to declutter this – we want to provide a complete end-to-end solution that helps customers easily provide value to their end-users, i.e., occupants. Our goal is to facilitate a disruptive evolution led by IoT and AI-driven software in the facilities management space, similar to other category transformations in the Indian context like e-commerce, ride sharing, and travel stays.

    Siloed Approaches in Facility Management Will Not Work: Facilio

    We would continue interacting with Prabhu in the next post. Continued »


    November 9, 2018  10:42 PM

    Is Object Storage The Ultimate Solution To Data Management Challenges?

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Management, Object storage

    Do you think object storage is the ultimate solution to data management challenges? Well, some figures say so. Recent research says more than 35% of enterprises use object storage currently, up from 28% a year ago. We all know there is a phenomenal increase in unstructured data in most organizations, and the volume is growing exponentially. So, the point is: can object storage ease data management challenges in a substantial way? The reality is that the whole concept of data generation and data storage is changing for almost all enterprises because of the amount of data an organization is producing. Higher production of data, of course, demands higher storage capacity, and the boundaries and limits are being redefined at high speed. So what is the ideal way for an organization to tackle this exploding situation?

    Object Storage

    Photo credit: xmacex on VisualHunt / CC BY-SA

    The question is not only about the humongous volume of unstructured data that an organization finds difficult to handle. The more severe point is the type of data it contains – text, images, videos, audio, and so on, in various formats. That is where the necessity of object storage emerges in a most relevant manner, because all this data is complicated and equally challenging to store and manage effectively. Addressing the challenge requires an efficient way of using and managing storage infrastructure; in today’s scenario, a storage infrastructure of any size soon falls short. And the question is not only one of storage, which is itself a complex and challenging task. The requirement is to make this data accessible and visible to create substantial business value out of it. Object storage is the only option for enterprises to handle data management challenges.
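
    One reason object storage suits such varied unstructured data is that each object can carry its own descriptive metadata alongside the payload. The sketch below assumes an S3-compatible object store reachable through boto3; the bucket, key, file name, and metadata values are placeholders of my own:

        # Sketch of storing unstructured content as an object with its own metadata,
        # assuming an S3-compatible store reachable through boto3; the bucket, key,
        # file name, and metadata values are placeholders.
        import boto3

        s3 = boto3.client("s3")   # credentials and endpoint come from the environment

        with open("scan-2018-11-09.pdf", "rb") as f:
            s3.put_object(
                Bucket="corporate-archive",
                Key="invoices/2018/scan-2018-11-09.pdf",
                Body=f,
                Metadata={"department": "finance", "type": "invoice", "year": "2018"},
            )

        # The metadata travels with the object, so it can be read back and used for
        # search or lifecycle decisions without parsing the file itself.
        head = s3.head_object(Bucket="corporate-archive",
                              Key="invoices/2018/scan-2018-11-09.pdf")
        print(head["Metadata"])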

    Object Storage is the Only Option

    Simply adding more capacity to cater to fast-increasing data volumes is not the right solution. It is, in fact, time to tap new technologies and strategies like object storage for data management.


    November 9, 2018  10:08 PM

    SIEM Is Only a subset of MSSP For Enterprise Security

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Enterprise Security

    Security Information and Event Management (SIEM) doesn’t cover the complete spectrum of enterprise security; it is just a subset of managed security services. In fact, looking at the increasing number of threats and vulnerabilities, enterprises are seriously looking for a suitable Managed Security Service Provider (MSSP). Even to take care of enterprise security via SIEM, most organizations lack sufficient hands-on expertise in deployment and management. That is where an MSSP comes into the picture, taking care of SIEM and additional security-related deployment and management services and letting enterprises consume them as a service. For many industry segments, like healthcare and finance, SIEM is mandatory, and it can be achieved either as a service or with in-house expertise. SIEM, as a matter of fact, ensures that vulnerability management is taken care of in the best possible manner.

    Enterprise Security

    Source: https://www.indiamart.com

    For enterprise security, it is important for an organization to hunt for a suitable MSSP that not only understands the enterprise’s needs but also has the expertise to deliver on them timely, optimally, and efficiently. This need is creating a large scope for MSSPs. While enterprises keep leveraging different and newer technologies to safeguard themselves, it is important for an MSSP to upgrade and upskill itself at a consistent pace. Another important point is that it is difficult for organizations to keep all expertise in-house, which results in higher hiring and retention costs, whereas an MSSP can easily maintain multiple security experts at multiple levels. As a matter of fact, most organizations don’t have a proper security division in place, even as network security and compliance management are getting more complex by the day for an enterprise to handle.

    Enterprise Security is getting complex day by day

    Therefore, MSSPs have a greater scope in the coming time to address the challenges of enterprise security with managed SIEM and other services offerings.


    November 9, 2018  9:40 PM

    Cloud Readiness Comes Well Before Cloud Management

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud management

    We all know this. But how many organizations really assess their cloud readiness before taking initiatives around cloud management? Of course, enterprises are finding different ideas and ways to prepare themselves for migrating to the cloud. Moving to the cloud is important to do away with a certain set of complexities, high capex, frequent disruptions, and the lack of talent to handle the on-premises model. That is why, in most on-premises cases, you will find multiple third-party vendors involved in managing it. Many enterprises are already getting a taste of multi-cloud and hybrid architectures, and they succeed where the cloud service providers have significant strength in supporting this migration or transformation. Only such vendors, in fact, are able to create long-term relationships with their clients. What organizations need is a long-term management plan that includes proper execution and optimized cloud deployments.

    Cloud Readiness

    Photo credit: Commander, U.S. 7th Fleet on VisualHunt.com / CC BY-SA

    Cloud readiness involves good preparation by enterprises in finding one of the best partners. Public cloud usage is increasing significantly, and even highly organized and regulated sectors like banking and insurance are finding it safe. On the other hand, many enterprises hold back from using public clouds for certain special projects and use private clouds for those instead. Vendors with expertise in managing multi-cloud environments and hybrid cloud architectures are in high demand. Enterprises prefer to partner with such third-party conversion experts right from the pre-flight stage, with an arrangement to keep them aligned for the boarding and in-flight stages. Cloud readiness and cloud governance are the two prominent factors in attaining success in this matter. The Voice of the Enterprise report states that more than 93% of enterprises are using cloud services in one form or another; interestingly, hardly 25% of those have succeeded in IaaS implementation.

    Cloud Readiness Includes Finding A Right Partner

    Cloud readiness includes many important factors, with advisory, planning, and design as the first and foremost. Next come activities such as operating model design, integration with existing resources, application refactoring, onboarding and migration, and the creation of safe ‘landing zones’.


    November 9, 2018  8:48 PM

    Innovation in Retail Banking Led By Digital Commerce and Tech Giants

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Infosys, innovation, Retail banking

    Infosys, along with Efma, a global not-for-profit organization, has launched the 10th Annual Research Study of Innovation in Retail Banking. Jim Marous, author of this report, is also the publisher of the Digital Banking Report and Co-Publisher of The Financial Brand. The 10th anniversary edition of the research focuses on banking innovations by 2022. The report also explores the significant innovations in the banking sector during the last 10 years and their impact on the banking and finance sector in India. Infosys Finacle is a product of EdgeVerve Systems, a product subsidiary of Infosys. The pace of innovation has increased significantly over the years and is going to increase exponentially in the coming years, because of the focus and importance being placed on innovation. Innovations are resulting in substantial transformations and are thus helping organizations leverage technology to a large extent.

    Innovation in Retail Banking

    Source: https://www.nppa.com.au/the-platform/how-it-works/

    The Annual Study of Innovation in Retail Banking had good participation, with more than 300 banks. The outcome finds that banks see Open Banking APIs as one of the key factors for the future of innovation, with Machine Learning (ML), chatbots, and Robotic Process Automation (RPA) as the other areas of interest to leverage the power of Open Banking APIs. In the short term, the report ascertains, banks will focus on automating compliance with regulations and schemes such as the Payment Services Directive (PSD) in Europe, the Unified Payments Interface (UPI) in India, and the New Payments Platform in Australia. The research also concludes that organizations are treating innovation as an integral process and measuring it objectively by calculating innovation ROI. In fact, in 2018, 63% of organizations had a target of achieving ROI in 1-3 years, compared to 54% in 2017. This shows the confidence of organizations in innovation and its significant returns.

    Innovation in Retail Banking is Heading Towards Major Breakthroughs

    The participating banks in this research on Innovation in Retail Banking expect that digital commerce platforms like Alibaba, Uber, and Amazon, along with the technology giants, will collaborate to be the innovation leaders by 2022. They also expect that the major innovations over the next 2-3 years will happen in areas like payments, mobile wallets, paperless transactions, and lending.


    November 5, 2018  9:37 PM

    Want to Eliminate Data Breaches? Time to Deploy A Zero Trust Architecture

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    architecture, data breaches, DH2i

    It is no secret that perimeter security is one of the greatest challenges facing today’s security and network admins. Conventional methods for connecting and protecting on-premises sites and/or multi-cloud environments, such as virtual private networks (VPNs), have proven to be ineffective. In addition, outdated methods and technologies have presented other problems as well. For instance, outdated technology and methods lead to complex configurations needing dedicated routers, ACLs and FW policies which greatly increase risk. Users only get a small slice of the network, which generates a dangerously vulnerable lateral network attack surface. At the same time, the inbound connection creates attack surfaces as well (e.g., DDoS). And, there is simply no ability to reduce attack surfaces with application-level segmentation.

    There are a number of vendors that have introduced solutions promising to overcome some or all of these problems, with minimal success. However, DH2i has launched a new software platform called DxOdyssey, that appears quite promising. Now generally available, DxOdyssey software dynamically deploys perimeter security where needed in order to isolate services for fine-grained user access, creating a Software-Defined Perimeter (SDP). The new DH2i SDP ensures the perimeter security model needed to achieve a “Zero Trust Network” also known as a “Zero Trust Architecture.” This concept was first introduced by Forrester and basically says that organizations should not automatically trust anything – inside or outside its perimeters – and must always verify everything trying to connect to its systems before granting access.
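
    To make the “never trust, always verify” idea concrete, here is a minimal, self-contained sketch of the principle only; it is not DH2i’s DxOdyssey API, and the service name, client IDs, and token scheme are my own placeholders:

        # Minimal illustration of the "never trust, always verify" principle described
        # above; this is not DH2i's DxOdyssey API. The service name, client IDs, and
        # token scheme are placeholders.
        import hashlib
        import hmac
        import secrets

        APP_KEYS = {"billing-db": secrets.token_bytes(32)}   # one key per protected service

        def issue_token(app, client_id):
            return hmac.new(APP_KEYS[app], client_id.encode(), hashlib.sha256).hexdigest()

        def grant_access(app, client_id, token):
            # No implicit trust based on source network: verify on every connection.
            expected = issue_token(app, client_id)
            return hmac.compare_digest(expected, token)

        token = issue_token("billing-db", "branch-office-gateway")
        print(grant_access("billing-db", "branch-office-gateway", token))   # True
        print(grant_access("billing-db", "rogue-host", token))              # False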

    Zero Trust Architecture

    “A ‘zero-trust’ methodology assumes that assets, users and resources are untrusted.  To achieve a zero-trust architecture takes an extremely focused and dedicated effort, but can strengthen most organizations’ risk posture,” said Eric Hanselman, Chief Analyst, 451 Research. “IT professionals seeking to secure today’s more complex hybrid and multi-cloud environments and build highly available, application-aware capabilities should consider offerings such as DH2i’s DxOdyssey to move beyond the complexity and limitations of traditional VPNs.”

    According to DH2i, DxOdyssey cross-platform SDP software overcomes the previously mentioned issues without the use and security risks of conventional networking connectivity approaches. DxOdyssey SDP software delivers:

    • – Micro-Perimeters –

    Application-level micro-tunnels give network admins the ability to deep segment by application, not by network. Limits remote users to fine-grained access to specific services. No ACLs or FW policies to manage. Eliminates lateral network attacks.

    • – Discrete Invisibility –

    Randomly generated non-standard UDP ports for dynamic on-demand micro-tunnel communications. Servers are cloaked and secured with no open ports. Virtually eliminates network attack surfaces.

    • – Multi-Cloud Secure –

    Designed to enable secure communications “from any host, to any host, anywhere” with application-level DTLS encrypted micro-tunnels and Public Key Authentication. Scales across environments to build a secure hybrid/multi-cloud distributed application infrastructure. No cloud vendor lock-in.

    • – Smart Availability –

    Dynamic movement of micro-tunnel gateways and application workloads with self-healing automatic fault detection and failover. The perimeter can be orchestrated to change dynamically so that micro-tunnels and workloads always find their best execution venue (BEV). The entire application infrastructure is “always-secure and always-on.”

    • – Lightweight software –

    Software-Defined-Perimeter solution. Just install on any host and connect. Integrates into existing network infrastructure. No network reconfiguration. No appliances to deploy, configure or maintain.

    “Traditional perimeter security solutions are obsolete for the new IT reality of hybrid and multi-cloud. They weren’t designed for them. They create too large of an attack surface. One need only open today’s paper to confirm this fact. Customers need a new perimeter security model to support hybrid and multi-cloud computing,” said Don Boxley, CEO, and Co-Founder, DH2i. “DH2i’s SDP solution, DxOdyssey, is purpose-built for this new perimeter security reality and is going to disrupt the multi-billion cloud VPN market* because it enables organizations to move away from using a traditional VPN and all of the associated issues for perimeter security. Instead, with DxOdyssey customers can build lightweight dynamic security perimeters to secure application- and workload-centric connections between on-premises and/or multi-cloud environments, with virtually no attack surface.”

    DH2i is entering a market space that is rich with opportunity. The global VPN market was valued at $45 billion (U.S.) in 2014 and is forecast to reach $70 billion in 2019 (Statista, 2018).


    October 31, 2018  5:06 PM

    5 Game Winning Strategies In A Global Environment

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja

    Recently there has been a phenomenal partnership between two companies from two different parts of the world: VEECON ROK and BSNL. Both are leaders in their own areas. While BSNL owns the largest network and user base in India, VEECON ROK has recently launched the world’s first 3D smartphones. The collaboration is a strategic tie-up in which manufacturing, quality control, and design happen in China, while marketing and support for India are handled by BSNL. There are a few game-winning tactics in this collaboration that make for a high success rate and low risk. Here they are:

    • Choose the Right Partners:

    Choosing the right partners makes it a win-win proposition even before the game begins. It plays a vital role in instilling the right mix of confidence and conviction, two very important factors for winning a game in any conditions. BSNL has the deepest penetration into the remotest parts of India’s population, which makes it possible to gain the maximum leverage that no other player in the country has.

    • Most Economical Production:

    Setting up a production unit in China makes it possible to produce the product under well-controlled processes and at the lowest cost. That will help in getting more and more buyers in the international market.

    • Best Possible Design Facility:

    The design center is in London, which ensures the highest quality of design.

    • Double Layer of Quality:

    There is a full team on the China factory premises to ensure the best quality in terms of quality assurance and quality control. On top of that, another quality team in Portugal and London oversees the quality team working in the China factory. This creates a double layer of quality, ensuring the fewest possible leakages during and after production.

    • Best Possible Collaborations:

    The global technology tie-up with NASA is one of its kind; NASA definitely would not collaborate with anything less in terms of technological depth. Similarly, the distribution tie-up with BSNL for India is another great example.

    These kinds of strategies and decisions definitely win half the game even before it has begun.


    October 31, 2018  2:37 PM

    VEECON ROK and BSNL announce Connecting India – 4

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    3D, BSNL

    This is the fourth and concluding post in the series. Here are the links to the previous posts:
    First post
    Second Post
    Third Post

    We continue here from the third post.

    Anupam Shrivastava, Chairman, and Managing Director, BSNL (Bharat Sanchar Nigam Limited) said, “I am pleased to be here at the launch of City Wide WiFi project for 25 cities. This is indeed going to be a landmark initiative towards providing internet to every Indian, quite in sync with the government of India’s Digital India programme. In the wake of arguably cutting edge technology and government initiatives, we are also pleased to associate with an emerging player like VEECON ROKiT and roll out this new range of mobile handsets.”

    This VEECON ROKiT range of mobile phones will be available from the first quarter of 2019, with sales and support through BSNL outlets in these cities.

    VEECON ROK Corporation Pvt Ltd and ROK Corporation USA are the two entities behind this launch. The former is a diversified international conglomerate with business interests in Energy and Power, Technology, Security & Surveillance, Media and Broadcasting, Music and Entertainment, and Tourism. ROK Corporation USA, on the other hand, came into existence to provide WiFi and other telecom-related services to smart cities in India and other neighboring countries. ROK Corporation is actually the company that manufactures the 3D mobile phones, which are now being introduced for the first time in the world.

    We are all familiar with Bharat Sanchar Nigam Limited (BSNL), an Indian state-owned telecommunications company headquartered in New Delhi. It was formed on 15 September 2000 for the purpose of providing telecom services and network management to the country. It is the largest provider of fixed telephony and broadband services, with more than 60% market share, and the fifth largest mobile telephony provider in India.


    October 31, 2018  2:30 PM

    VEECON ROK and BSNL announce Connecting India – 3

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    3D, BSNL

    This is the third post in the series; you can read the previous posts by clicking on the respective links. Here is the link to the first post, and here is the link to the second post.

    “The City Wide Wi-Fi Network of 25 cities is in sync with our smart city Initiative and would be taken ahead with our Superior technology to drive this. We believe public and private partnerships affect serious positive change and we could not have found a better partner than BSNL to take this initiative ahead. The amalgamation of BSNL’s reach across the country and our cutting-edge technology is certain to be a game changer in the business of running a city. For this project, we plan to invest $5 bn in the next three to five years.” Srivastava adds further.

    When we talk of a smart city, some of the things need to be part of that ecosystem by default. Availability of high bandwidth without any disruption is something that becomes a basic necessity in that scenario.

    Jonathan Kendrick says, “We are glad to have introduced the first glasses-free 3D mobile phone here in India. India is one of the fastest growing economies in the world with a huge consumer base for mobile handsets and we see a huge potential in the country for our products.”

    John Paul DeJoria, says, “In line with the Government of India’s digitization drive, the City Wide Wi-Fi Network in 25 cities across India will bring the world closer to the people. We are thankful to BSNL for partnering with us to drive this ambitious project forward. We are positive about the success of the pilot project and looking forward to embarking on the next phase soon.”

    We will continue this interesting series about a phenomenal global collaboration in the next post.


    October 31, 2018  2:25 PM

    VEECON ROK and BSNL announce Connecting India – 2

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    3D, BSNL

    This post is in continuation of the previous post, which you can visit by clicking here. The launch includes India’s first 3D mobile phone, which plays 3D videos perfectly without any need to wear 3D glasses. In fact, this mobile opens a new spectrum in the fields of design, hospitality, education, architecture, and medicine. The cities covered in the City Wide Wi-Fi Network project are Vijayawada, Navi Mumbai, Bangalore, Chennai, Hyderabad, Kolkata, Varanasi, Ghazipur, Panaji, Pune, Lucknow, Ahmedabad, Bhopal, Jaipur, Patna, Cochin, Guwahati, Tirupati, Shimla, Chandigarh, Noida, Gurugram, Dehradun, Indore, and Agra. Five models of VEECON ROKiT mobile phones cover all segments of the price and technology range, and all five handsets were on display to explore and experience. The next day, the company was to showcase the same technology at the three-day Mobile Congress event at Aerocity.

    The company was hopeful of attracting a large audience and creating a good level of excitement and enthusiasm among media and visitors. The ROKiT One and ROKiT F-One are entry-level classic calling devices featuring WiFi calling. The next phone in the hierarchy is the ROKiT IO Light which, as its name suggests, is a lightweight smartphone with impressive processing power. The top two models are the ROKiT IO 3D and ROKiT IO Pro 3D; these two, in fact, are the first glasses-free 3D smartphones being offered for sale in India. All the ROKiT mobile phones will be available for sale from early next year, bundled with BSNL SIMs and support.

    Gaurav Kumar Srivastava says, “We believe technology should improve everyone’s quality of life. The ROKiT range of phones is going to revolutionize the smartphone business in the country as it is backed by rich technological prowess and tech-savvy engineering. The unique feature of the 3D mobile phone is that you don’t need to wear 3D eyeglasses to use it. Indian consumers today are well-read, tech-savvy and expect new technologies to come their way to surprise them. We are upbeat about the brand’s prospects in India as we believe with our ROKiT range we will be able to strike the right chord with them.”

    Continued »


    October 31, 2018  2:20 PM

    VEECON ROK and BSNL announce Connecting India – 1

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    3D, BSNL

    VEECON ROKiT 3D mobile phones were unveiled recently in New Delhi. Anupam Shrivastava, Chairman & Managing Director, BSNL, was present on the occasion. A project in which BSNL gets involved is all but assured of grand success on the strength of the rich legacies it carries: firstly, it is the oldest and largest network in the country with the highest number of subscribers and customers, and secondly, it has deep penetration into the remotest locations in India. Gaurav K. Srivastava, Chairman, VEECON ROK Corporation, was also there at the event to announce BSNL’s unique, most ambitious, yet attainable mission in collaboration with VEECON ROK Corporation. Jonathan Kendrick, Chairman and Co-founder, ROK Group of Companies, along with John Paul DeJoria, Co-founder, ROK Group of Companies, and many other top officials from their London, China, and Portugal offices were also present. When it comes to BSNL, it ensures low risk and high success.

    The mission is to connect India with the help of 25 city-wide wireless networks and to take on the responsibility of bringing the world’s first-ever 3D mobile phone to India. VEECON ROK Corporation will invest more than $5 bn in India for this Wi-Fi project. The target is to complete this deployment and connect the 25 chosen cities in the next three to five years. VEECON ROK Corporation Pvt. Ltd. is a joint venture between two companies of global repute, the VEECON Group and ROK Corporation. The project comes rightly in the wake of the Government of India’s digitization call, whose sole aim is to provide internet access to every Indian. The City-Wide Wi-Fi Network, therefore, will provide internet access across large segments of India and will be operated jointly by VEECON ROK in association with BSNL.

    On this occasion, VEECON ROK Corporation also announced a new range of five Veecon ROKiT mobile handsets.

    Continued »


    October 30, 2018  11:56 PM

    DST-CII India Italy Technology Summit 2018

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence, cybersecurity, Ecommerce, India, Italy, Mobility, Renewable energy

    The DST-CII India Italy Technology Summit, a two-day event, was held on 29 and 30 October at the Taj Diplomatic Enclave. DST is the Department of Science & Technology under the Ministry of Science & Technology, Government of India. CII is the Confederation of Indian Industry. The valedictory session on the afternoon of the 30th was the key attraction, with speeches by Italian Prime Minister H. E. Giuseppe Conte and Indian Prime Minister Narendra Modi. Both prime ministers sharing the same dais shows the gravity and importance of the matter. It is practically important to collaborate in today’s world to achieve success faster, rather than reinventing the wheel in isolation and wasting energy. Indian space scientists have recently given a superb technology to the world, using which countries across the globe can now launch their satellites into space at a fraction of the earlier cost.

    Technology Summit

    That is a phenomenal achievement indeed, as mentioned at the Technology Summit 2018. And of course, besides other countries, Italy is also one of the countries to avail of this technology and lower its satellite launch costs. On the same grounds, there are many technological advancements happening in Italy that India can leverage to strengthen itself in those fields. These fields are mainly digital innovation, robotics, bionics, automated vehicles, design, renewable energy, education, aviation, airport infrastructure, commercial satellites, and so on. As a matter of fact, convergence and innovation is the need of the hour at both ends. Business collaboration in the field of data and machine learning is one area where the two countries together can do wonders. Similarly, in the field of ICT (Information and Communication Technology), 5G and fibre are areas that both can explore together to achieve new heights and benefit mutually.

    DST-CII India Italy Technology Summit was a great success

    As mentioned during the valedictory session at the DST-CII India Italy Technology Summit 2018, Italy’s manufacturing strengths are known worldwide, and Indian manufacturing has a lot to learn from them. Promotion and collaboration in fintech is another field that is important for both countries to explore. Artificial Intelligence (AI), cybersecurity, research, mobility, eCommerce, academics, startups, social challenges, and competencies are areas where both countries have made substantial progress. Collaboration in these fields will definitely benefit both.


    October 26, 2018  9:50 PM

    InterraIT Conclave Mulls The Future of Artificial Intelligence

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Artificial intelligence

    AI (Artificial Intelligence), ML (Machine Learning), and VR (Virtual Reality) are newer areas of technology that need deep penetration and faster adoption. Organizations worldwide are talking about these technologies, but how many of them are actually moving forward to adopt them? Ultimately, if something can do wonders for an organization and its stakeholders, including customers, then why not do it now rather than wait for others to adopt it, succeed or fail, and only then take a call? By then, the organizations that just wait and watch will probably fall so far behind in the race that it will be difficult for them to catch up and regain their leadership position in the market. Keeping that in mind, at the InterraIT conclave the organization decided to take a step forward and embrace Artificial Intelligence (AI). This will help reorient India’s IT ecosystem.

    Prominent experts and industry leaders were present at the InterraIT conclave last week. InterraIT has recently partnered with a global firm that is a pioneer in autonomous testing. The organization already has a good reputation and wide presence in India, which it can leverage to its full strength. The plans are quite aggressive, aiming to lead the technological changes, and more such alliances are expected soon. Asoke K Laha, Founder, President, and CEO of InterraIT says, “We need to look inward at domestic markets with a view to bringing to the table sophisticated new technologies like open-source software, cloud computing, data analytics, and now artificial intelligence.” Many other prominent experts were present at the conclave to suggest the right strategies to stay ahead of global competition. After all, profitability is the key to success.

    InterraIT Conclave Brings Forth Various Ideologies on AI

    Present at the InterraIT Conclave, Dr. Prabhat Kumar, an e-governance and AI expert, says, “These technologies are already available and we need to adapt them to our needs. There’s no need to reinvent the wheel.” In the same context, Puneet Jindal, Founder of Data Science Delhi and Lead Data Scientist at RateGain, said, “AI has been around for decades; it’s just that new programming has come into play and is easy to adopt. We must focus on the basics and build simple things to excite the market.” Aniruddha Guha Sarkar, Senior Vice President, InterraIT, says, “We need to align them with customer demands in order to stay ahead of the challenges we face in the area of customer satisfaction.”


    October 17, 2018  9:54 PM

    Archive360 Announces Full Support For Microsoft Azure Data Box – 2

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Azure, Azure Backup, Cloud azure, Microsoft

    This post concludes the discussion started in the previous post. The Microsoft Azure Data Box gives customers a more secure method for faster, more reliable, and simpler transfer of any volume of data to Azure. Customers only need to connect it to their network and then load data onto the Data Box using standard NAS protocols. The data is kept safe with AES-256 encryption. After all the data is loaded, the Data Box is couriered back to the Microsoft Azure Data Center, where the data is uploaded to Azure. Thereafter, the device is securely erased and readied for its next use. As a matter of fact, Archive2Azure is an intelligent information management and archiving platform that enables organizations to capture, extend, onboard, and manage numerous types of structured and unstructured data in their Microsoft Cloud (Microsoft Office 365 and Azure).
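
    To make the loading step a little more concrete, here is a minimal sketch of copying a local data set onto a Data Box share mounted over a standard NAS protocol such as SMB. The paths, share name, and folder layout below are assumptions for illustration only, not something prescribed by Microsoft or Archive360.

        # Minimal sketch: copy local data onto a mounted Data Box share.
        # The source path and the mount point are hypothetical.
        import shutil
        from pathlib import Path

        SOURCE = Path("/data/archive")                  # local data to migrate (assumed)
        DATABOX_SHARE = Path("/mnt/databox/blockblob")  # mounted Data Box share (assumed)

        def copy_to_databox(source: Path, destination: Path) -> int:
            """Copy every file under source to the share, preserving the folder layout."""
            copied = 0
            for item in source.rglob("*"):
                if item.is_file():
                    target = destination / item.relative_to(source)
                    target.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(item, target)  # copy2 also preserves timestamps
                    copied += 1
            return copied

        if __name__ == "__main__":
            print(f"Copied {copy_to_databox(SOURCE, DATABOX_SHARE)} files to the Data Box share")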

    Microsoft Azure Data Box

    This holds true for organizations of all sizes across every industry vertical. The solution works in a way that adheres to all regulatory, legal, and business requirements in a legally defensible and compliant manner. Archive2Azure addresses any organization’s key requirements, including ensuring that all data is stored in its original format. It also ensures the data is WORM compliant when needed, in addition to addressing security, fast retrieval, and availability concerns. The aim of Archive2Azure is to let customers maintain all their data in their own Azure tenancy. While doing this, the customer can assign retention and disposition policies in a smooth, easy, and flawless manner. In this kind of job, it is very important for the system to maintain a chain of custody and to provide a seamless mechanism for powerful audit and reporting.

    Microsoft Azure Data Box

    After all, it is important that customers can work with complete peace of mind. Archive360 is a pioneer in intelligent information management for customers on the Microsoft Cloud. Its rugged, well-tested, fail-free platform helps organizations of any size, with any volume of data, drive down the cost, uncertainty, and risk of digital transformation to and in the cloud. Through its solution, it onboards, validates, and manages all digital assets of an enterprise while ensuring meaningful predictability, data insights, analytics, and defensibility. It also ensures security-focused infrastructure independence by providing non-proprietary information management. Archive360 has a global presence via a reliable network of partners, and the Archive2Azure Platform is certified by Microsoft Azure.


    October 17, 2018  9:43 PM

    Archive360 Announces Full Support For Microsoft Azure Data Box – 1

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Azure, Azure Backup, Cloud azure, Microsoft

    September 25th was another day of remarkable achievement, when Archive360 announced full support for Microsoft Azure Data Box. Full support here means a well-tested, rugged system with a zero failure rate, because this homegrown system is the result of thorough research, development, and testing. In fact, the solution was one of the highlights at the recently concluded MS Ignite show. Archive360 is among the first to deliver full support for the Microsoft Azure Data Box family. This integration with Azure Data Box has multiple benefits for existing and new Archive360 clients. It gives them a more rapid and cost-effective way to transfer large data volumes to Azure, with the assurance that there will be no impact on bandwidth or network resources. The whole system works seamlessly. Archive360® is a pioneer in providing intelligent information management solutions.

    Azure Data Box

    With many firsts to its name already, it is now among the first to deliver full support for Microsoft Azure Data Box. In addition, it is among the first to be added to the Microsoft Azure Data Box Partner Program. That itself indicates a high level of reliability and confidence. In fact, all existing clients of Archive360’s Archive2Azure™ will now get a much richer experience of easier and faster data migration to Azure. The best part is that the whole solution is highly cost-effective, completely secure, and compliant with all legal and regulatory requirements.

    Archive360 is the first to deliver full support for Azure Data Box

    Bill Tolson, Vice President of Marketing, Archive360 says, “Our Archive2Azure customers rely on us to onboard massive volumes of many different types of data as a critical element of their data estate modernization initiatives. The Microsoft Azure Data Box enables us to extend these capabilities with a new option that ensures absolutely no impact on bandwidth or network resources. By combining Archive360 with Azure Data Box, clients can complete migration initiatives in less time, and accelerate their time-to-value in Azure. We are delighted to extend our alignment with Microsoft and proud to be among the first to be approved and added to their Azure Partner Program.”

    Dean Paron, Director Azure Data Box, Microsoft Corp. says, “We’re pleased to have Archive360 as a partner for Microsoft Azure Data Box. Working together, these solutions streamline large-scale data migration to the cloud and help customers accelerate their Azure investments.”

    We shall continue in the next post. Continued »


    October 2, 2018  5:57 PM

    Data Accuracy Platform by Naveego Brings A Revolution – 3

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data, Data Management, Master data management

    In continuation of post 1 (here) and post 2 (here), we conclude this series on the Data Accuracy Platform by Naveego, which is no less than a revolution. This is probably the only product in the world that promises an 800 percent ROI within 60 days of implementation. That is an absolutely phenomenal proposition for any organization. We were discussing the losses organizations and governments incur due to poor data quality and poor data handling. On average, an organization spends about $100 for each incorrect record it carries.

    That simply means it takes roughly $10 million to maintain 100,000 incorrect records. There is another aspect to it. Organizations hire high-value data scientists to create valuable outcomes; instead, more than 80 percent of their time goes into collecting and cleansing this data. Data scientists, in fact, end up performing “data janitor work”.
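
    The arithmetic behind that figure is straightforward; here is a tiny sketch using only the numbers quoted above (the helper function itself is just an illustration):

        # Back-of-the-envelope cost of carrying incorrect records,
        # using the ~$100-per-record figure cited above.
        def cost_of_bad_records(record_count: int, cost_per_record: float = 100.0) -> float:
            return record_count * cost_per_record

        print(cost_of_bad_records(100_000))  # 10000000.0, i.e. $10 million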

    It is really unfortunate that organizations and data scientists are not able to focus on actual data analysis. The Data Accuracy Platform by Naveego brings a perfect solution to this. It provides organizations with a 360-degree assessment of information assets across traditional as well as new data. The sources of traditional data are CRM, ERP, and MDM systems, while the sources of new data are web searches, IoT sensors, social media, streaming data, clickstreams, and so on. The solution beautifully connects all sources of data into a single view. This way, businesses can not only proactively attain global data health in one place but also let their high-salaried data scientists perform accurate data analysis. Data accuracy, data consistency, and the availability and accessibility of information for business decision makers thus no longer remain bottlenecks for the business.
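
    To give a rough feel for what a single, consolidated view of a record can look like, here is a small sketch that merges entries from several sources keyed on a shared customer ID. The source names, fields, and merge rule are hypothetical and are not Naveego’s actual implementation.

        # Rough sketch: consolidate records from several sources into one view,
        # keyed on a shared customer ID. Sources and fields are hypothetical.
        from collections import defaultdict

        crm_records = [{"customer_id": "C-100", "name": "Acme Corp", "email": "ops@acme.example"}]
        erp_records = [{"customer_id": "C-100", "open_orders": 3, "credit_limit": 50_000}]
        iot_events  = [{"customer_id": "C-100", "last_sensor_reading": 72.4}]

        def build_single_view(*sources):
            """Merge records from every source into one dictionary per customer ID."""
            view = defaultdict(dict)
            for source in sources:
                for record in source:
                    view[record["customer_id"]].update(record)
            return dict(view)

        print(build_single_view(crm_records, erp_records, iot_events))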

    Data Accuracy Platform by Naveego is easy to deploy and highly affordable

    Above all, the Data Accuracy Platform by Naveego is affordable, easy to deploy, and quick to deliver results, with an 800% ROI in 60 days.

    Katie Horvath, CEO of Naveego says,

    “The race to drive competitive advantage through the better use of information assets is leading to demand efficiency – which means demand for reliable data. In order to monetize data and obtain ROI on data collection investment, efficiencies must be achieved to get clean and accurate data that businesses can rely on. This is why demand for Naveego’s Complete Data Accuracy platform is exploding. In the Hadoop world of big data, and the ‘new data’ from sources not managed by traditional master data management installations (such as IoT data), businesses need a solution that brings together and manages both new and traditional data from disparate sources, scrub it, and deliver it ready for consumption analysis and business productivity and profitability.”

    Breitburn Energy Partners LP uses Data Accuracy Platform by Naveego. Mike Kasprzak, Data Administration Services Manager, Breitburn Energy Partners LP says,

    “Traditional on-premises data quality solutions are expensive to deploy, require teams of specialists and take months before delivering any value to the business. Naveego stands apart with a seamless, affordable solution that empowers our analysts with quality data they can utilize immediately to more efficiently do their work. They can focus their skills and expertise on creating value based on reliable data – a huge advantage for both business and compliance reasons.”


    October 2, 2018  5:12 PM

    Data Accuracy Platform 2.0 by Naveego Brings A Revolution – 2

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Big Data

    This is the second post in the series; you can read the first post here. As we saw, the Data Accuracy Platform 2.0 from Naveego tackles big data in a very intelligent manner. It manages data without getting into complexity or the substantial cost of a customized, time-consuming installation. In fact, the installation is quick and there is virtually no ongoing maintenance. How Naveego does it is interesting to know. A good amount of research led them to a solution that uses API connections. The solution simply connects with any kind of data at its point of origin, without needing to push or pull the data. This, as a result, creates a win-win situation for any organization, delivering an ROI of 800 percent, not in months or years but within 60 days of implementation.

    Data Accuracy Platform from Naveego

    Data in organizations is definitely increasing at an exponential rate. All this is happening due to projects related to AI (artificial intelligence), ML (machine learning), IoT (Internet of Things), mobility, intelligent devices, autonomous devices, and so on. These are creating a tremendous array of data streams that is becoming difficult to handle. Storage, in fact, is not the issue, because any organization can simply spend some money to buy storage space, either in the cloud or on-premises. What most organizations cannot do is reap the fruits of this huge volume of data. That is where Naveego’s Data Accuracy Platform 2.0 comes in, with a unique, state-of-the-art solution. There are various reports from reliable sources proving that organizations are incurring losses on various fronts because of this data, and that the problem needs immediate attention.

    Data Accuracy Platform From Naveego

    For instance, the U.S. economy incurs a loss of over $3.1 trillion every year due to poor data quality. Data cleansing is becoming a big pain for organizations. This pain did not exist earlier, when organizations ran only a few applications and all of their data was structured.

    We continue on the same topic in our next post… Continued »


    October 2, 2018  4:42 PM

    Data Accuracy Platform 2.0 by Naveego Brings A Revolution – 1

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Data Management, Data quality, Master data management

    Naveego has announced the company’s Data Accuracy Platform 2.0. It brings a new revolution by providing a 360-degree view of information assets across all business data platforms. These include business data from ERP (Enterprise Resource Planning), CRM (Customer Relationship Management), and MDM (Master Data Management) systems. The real value comes from its power to also support new data from web searches, IoT sensors, clickstreams, streaming data, and social media. Now, that is what I call a real revolution in terms of empowering a business to get a holistic view of all relevant data in one place, because only then does it make real sense for businesses to proactively achieve global data health in a single view. That happens by connecting all sources of business data into one view, while taking care of the basic things that are critical for any business in this regard.

    Data Accuracy Platform

    Data Accuracy Platform 2.0 is simple to use, keeps data accurate and consistent across the enterprise, and is securely accessible to key business decision makers. It is, in fact, a next-generation data quality and master data management solution. Amazingly, it promises to deliver 800% ROI (Return on Investment) within 60 days rather than years. Most enterprises keep accumulating data and investing in storage and infrastructure without actually transforming it into actionable information assets across multiple systems. That is what this new Data Accuracy Platform promises to deliver, and it does this through a simple self-service implementation model. This model empowers customers to deploy the solution within a few hours, thereby saving the huge cost of deployment and customization. Looking at the new requirements emerging because of big data and social media, organizations have not been able to find the right solution to tackle this situation.

    Data Accuracy Platform Brings A New Revolution

    We shall continue in the next post… Continued »


    September 30, 2018  9:38 PM

    Cloud-to-Local Backup Puts You in Control of Your Cloud Data – 4

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud Backup

    Q: Yes, you mentioned that the solution can save as much as 75 percent in costs—how does it achieve these cost savings for customers? What is the pricing model?

    LC: Cloud-to-cloud solutions require ongoing costs for cold-storing your data in the cloud or a C2C location where costs can scale very quickly. The local storage capability of BackupAssist 365 eliminates this need, so that’s where it can save up to 75 percent over cloud-to-cloud. We use yearly subscription licensing based on the number of unique user accounts or email addresses—so the more users subscribed, the larger the discount. Our pay-per-user-account pricing model begins at $1.00 per user per month for up to 24 users. Volume subscription pricing is also available. So if you have 25 to 49 users, the rate is 95 cents per month per user, and for 50 or more users, the price drops to 90 cents per month per user. So you can see that the solution puts the control back in the customer’s hands financially, as well as practically, through use of on-premise backups. Really, BackupAssist 365 gives SMBs the best of both worlds: the benefits and convenience of cloud storage for primary operations, plus gaining control of their data through local backups rather than being at the mercy of a third-party cloud vendor.
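
    Working only from the per-user rates quoted above, the tiered monthly cost can be sketched as follows (an illustration of the arithmetic, not an official pricing calculator):

        # Monthly subscription cost using the per-user rates quoted above:
        # $1.00/user up to 24 users, $0.95 for 25-49 users, $0.90 for 50 or more.
        def monthly_cost(users: int) -> float:
            if users >= 50:
                rate = 0.90
            elif users >= 25:
                rate = 0.95
            else:
                rate = 1.00
            return round(users * rate, 2)

        for n in (10, 30, 100):
            print(n, "users ->", monthly_cost(n), "USD per month")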


    September 30, 2018  9:37 PM

    Cloud-to-Local Backup Puts You in Control of Your Cloud Data – 3

    Jaideep Khanduja Jaideep Khanduja Profile: Jaideep Khanduja
    Cloud Backup

    Q: How does this solution handle the types of problems you mentioned, such as when an organization’s data is lost by something as commonplace as an accidental deletion, or more sinister like a ransomware attack?

    LC: Accidental deletion comes down to the risk of human error. Because BackupAssist 365 is fully automated, it removes the risk of human error while saving time and freeing up IT to focus on other priorities. The solution also provides ransomware protection, since it’s easy to create a local backup of cloud-based data that are “air-gapped” while facilitating quick recovery if and when it’s required. BackupAssist’s latest solution is a step ahead of anything currently on the market in that it’s the only solution that enables organizations to build what’s in essence a 360-degree shield against threats.

    So in addition to the cloud-file-to-local-backup capability—where you can back up your files from Google Drive, Dropbox, OneDrive, SFTP, or WebDAV to a local destination—you can also back up all of your mailbox data from Office 365, Gmail, Outlook, Rackspace, Exchange, or IMAP to a local destination. What’s more, the solution offers automation for easy backup scheduling, plus the tools that SMBs need for data compliance, because it gives organizations a local and controlled copy of their own data for compliance purposes. BackupAssist 365 also helps enterprises manage legal and accounting requirements: simply place a copy of the backup in an archive, and this will help with compliance years down the road. Then there’s the ransomware defense that I mentioned earlier—since files are backed up to a local destination, they’re never lost even if you’re a victim of a ransomware attack. And last but not least, you save significantly on storage cost by storing locally.
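
    To make the cloud-to-local idea concrete in a generic way, here is a small sketch that downloads messages from an IMAP mailbox to local disk using Python’s standard imaplib. This is not BackupAssist 365’s implementation or API; the host, credentials, and destination folder are placeholders, and a real deployment would add scheduling, incremental runs, and error handling.

        # Generic illustration: pull messages from an IMAP mailbox to local storage.
        # Host, credentials, and destination are placeholders; this only sketches the
        # cloud-to-local concept and is not the BackupAssist 365 mechanism.
        import imaplib
        from pathlib import Path

        HOST = "imap.example.com"          # placeholder IMAP server
        USER = "user@example.com"          # placeholder account
        PASSWORD = "app-password"          # placeholder credential
        DESTINATION = Path("mail_backup")  # local, offline destination folder

        def backup_inbox() -> None:
            DESTINATION.mkdir(exist_ok=True)
            with imaplib.IMAP4_SSL(HOST) as imap:
                imap.login(USER, PASSWORD)
                imap.select("INBOX", readonly=True)
                status, data = imap.search(None, "ALL")
                if status != "OK":
                    return
                for num in data[0].split():
                    status, msg_data = imap.fetch(num, "(RFC822)")
                    if status == "OK":
                        # Save the raw RFC 822 message as an .eml file.
                        (DESTINATION / f"{num.decode()}.eml").write_bytes(msg_data[0][1])

        if __name__ == "__main__":
            backup_inbox()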

