September 13, 2012 8:56 AM
Posted by: Michael Tidmarsh
Post by Al Perlman
The latest round of the Quantum-sponsored event series Virtualization, Cloud and The New Realities for Data Protection is coming to a close on Thursday evening with the final event in King of Prussia, Pa. If you are in the Philadelphia area or otherwise within reach, it is an event definitely worth attending, particularly if you have any questions about data protection and the cloud. And, face it, if you are in IT today, can you afford not to have questions about data protection and the cloud?
The two latest meetings, in St. Louis and Portland, Ore., earlier this week were enlivened by the recent introduction of Quantum’s Q-Cloud, which enables customers to purchase business-class backup and recovery solutions through a subscription-based service model. Several key aspects of Q-Cloud have been gathering interest among participants at the events, notably:
- Pricing: Depending upon your configuration and features such as deduplication, your backup storage costs with Q-Cloud could be as low as a penny a gigabyte per month, which is unprecedented for a business-class backup and recovery solution delivering this level of performance and peace of mind.
- Simple Integration: Q-Cloud integrates seamlessly with existing solutions without the need for any modifications. This is critical for IT and storage professionals looking to reduce complexity and it enables organizations to have a broader range of fully integrated solutions for all of their various storage requirements.
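The pricing point above lends itself to a quick back-of-the-envelope calculation. The sketch below is an illustration only: the function name, the default rate and the deduplication ratio are assumptions for the sake of the example, not Quantum’s actual pricing model.

```python
def monthly_backup_cost(capacity_gb: float, rate_per_gb: float = 0.01,
                        dedupe_ratio: float = 1.0) -> float:
    """Estimate monthly backup storage cost (hypothetical model).

    capacity_gb   -- logical amount of data being protected
    rate_per_gb   -- per-gigabyte monthly rate (the quoted penny a gigabyte)
    dedupe_ratio  -- reduction factor from deduplication (e.g. 10.0 for 10:1)
    """
    stored_gb = capacity_gb / dedupe_ratio  # only deduplicated data is stored
    return stored_gb * rate_per_gb

# e.g. 10 TB of backup data that deduplicates 10:1 would cost on the
# order of ten dollars a month at a penny a gigabyte
cost = monthly_backup_cost(10_000, dedupe_ratio=10.0)
```

The point of the sketch is simply that deduplication compounds the low per-gigabyte rate: the better your data deduplicates, the less you actually store and pay for.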
As the event series has moved across the country, participants all over the U.S. have talked about the costs involved in data protection and have been looking for ways to minimize costs while maximizing data safety. An advantage of incorporating Q-Cloud into an overall strategic storage solution is that it helps IT manage data based on its value to the organization, particularly for backup and recovery. By using an inexpensive cloud-based solution for backup and archiving, organizations can save money that might otherwise go toward disk and tape libraries, while having the peace of mind that they can easily get to their data if and when they need it.
“One of the big things we’re talking about at these events is to look at your environment and avoid treating all of your data the same,” says Greg Schulz, the widely respected storage analyst who has been the keynote speaker at all of the events in the series. Schulz, the founder and senior advisor at The Server and StorageIO Group (StorageIO), says organizations deploying a cost-saving tool such as Q-Cloud can address other issues within their storage infrastructure. “Budgets are definitely a factor,” he says. “We are seeing that people can take the savings derived from removing complexity and from more effective data protection and reinvest them in infrastructure to support continued growth and improvements.”
Saving money – or spending less money – on backup and recovery will be critical as organizations cope with ever increasing amounts of data. If you look at some of the market research on how quickly and massively data is growing, it is absolutely mind-boggling. One of the studies I like to quote comes from Smithsonian Magazine: By next year we will be generating as much data as was produced in the entire history of mankind from the beginning of time to 2003 – and we will be producing that amount of data every 10 minutes. The more organizations can save on backing up that data – savings not just in costs but also in complexity, space and manpower – the better off we will all be in dealing with data as it just keeps growing and growing and growing.
What’s next? Well, as noted, there is one more event in the series, which means we’ll have one more blog post to wrap things up. Stay tuned early next week for our final post in this series. In the meantime, check out the latest information on Q-Cloud.
September 6, 2012 11:36 AM
Posted by: Michael Tidmarsh
Post by Al Perlman
The latest dinner event in the Quantum-sponsored series Virtualization, Cloud and the New Realities for Data Protection took place last week at the Mohegan Sun Casino in Uncasville, Conn. The location turned out to be pretty apt, because one of the big issues that keeps coming up at these events is IT executives not wanting to gamble with their data protection, particularly in the cloud. One of the great things about the event series, which has now played in a dozen locations, is that the participants are able to voice their concerns and talk to one another – and the experts on hand – to figure out how to address the issues they have with protecting data in the cloud.
So what are some of those issues? Glad you asked.
Security is certainly one of the big issues. Participants looking at cloud solutions want to know how safe and secure their data will be in the cloud; whether their information will be protected against leaks; whether it will remain private; where it will be stored – even down to the country in which the data will be located; how reliable cloud service providers will be in terms of availability and performance; and what kind of scalability they can expect from their cloud solutions.
For Greg Schulz, the keynote speaker at the event series, these questions are critical steps in helping IT pros figure out how to address their data protection challenges in the cloud. “If you know what your concerns are, you can address them,” Schulz says. “You can map out your challenges and then address what you need to do and what you may need to work around. The key is don’t be scared – do your homework. If you’ve read about someone else’s data being compromised, find out why. Remember, the cloud is a shared responsibility. Yes, the service provider has a responsibility to live up to your SLAs, but as the customer you have the responsibility of choosing the SLAs.”
One of the tradeoffs in choosing a cloud supplier is price and capacity versus performance. One of the participants at Mohegan Sun brought up the Amazon Glacier data archiving service and its low cost of approximately a penny a gigabyte per month. Schulz pointed out that at this price you are getting “very, very, very low performance.” He said that in looking at data protection strategies in the cloud, it’s important to look beyond the sheer cost per gigabyte and focus on your service objective: What do you need to accomplish, and what level of performance do you require to meet your goals? One of the larger points that came up at Mohegan Sun was to look beyond costs and even beyond availability and focus on what you actually get for your response time. There are other areas where you can save costs – for example, changing how often you are backing up data. One way to save money is to include only data that is changing in regular backups: By focusing only on this data, you can continue to enhance availability and quality of service without necessarily spending more money.
The theme played very well for Quantum, which two days before the event announced its Q-Cloud, which – unlike Amazon Glacier – is an enterprise-performance backup solution that provides cost-competitive storage with much higher performance and reliability for business uses. Quantum has characterized Q-Cloud as a “business-class solution,” as opposed to Amazon Glacier’s “economy class.” Q-Cloud is setting new standards for business-level backup and archiving, combining on-premise storage with cloud storage. How can Q-Cloud help you deal with the “new realities for data protection?” Tune in next week for our next blog post.
Meanwhile, the event series continues next week in three more cities: Monday, Sept. 10 in St. Louis; Tuesday, Sept. 11 in Portland, Ore.; and Thursday, Sept. 13 in King of Prussia, Pa. If you are close to any of these cities and you have any concerns about data backup, this is your chance to get free expert advice and talk to your peers. Register here.
August 29, 2012 11:25 AM
Posted by: Michael Tidmarsh
Post by Al Perlman
Big Data is one of the megatrends driving next-generation computing. The research firm IDC characterizes Big Data analytics as one of the four pillars of the computing industry’s Third Platform, along with cloud services, mobile computing and social networking. McKinsey & Company describes Big Data as “the next frontier for innovation, competition and productivity.” Exactly who coined the term Big Data and when is up for dispute, but it’s really only been in the past few years that Big Data has become part of the consciousness and lexicon of IT professionals.
We bring this up because the concept of Big Data precipitated an interesting discussion following one of the dinner meetings last week in the Quantum event series titled Virtualization, Cloud and the New Realities for Data Protection. At the event the keynote speaker Greg Schulz was talking about Big Data in relation to the challenge expressed by many in the audience of needing to protect data – all data – with certainty. Schulz, who is a widely respected expert on storage and is the founder and senior advisor at The Server and StorageIO Group (StorageIO), was making the point that the issue of “certainty” in data protection played well into the Quantum story. The point was that Quantum has been protecting data with certainty for many years – in the cloud, outside of the cloud, on tape, on disk, on “little” data and on Big Data.
Following the event – on a shuttle bus, in fact – the discussion that began at the event was continuing full force. One of the participants raised the point that Big Data is a relatively new phenomenon, so how could Schulz state that Quantum had been doing Big Data for years? Schulz asked the person who raised the question where he worked and what he did. Turns out he was the CIO at a geology firm. Schulz chuckled and mentioned activities such as seismic geosearch, mapping surveillance, geo-mapping – all driven by the concepts of Big Data analytics, where massive amounts of data must be analyzed quickly and the storage solution must be high capacity, fast, accurate and reliable. “If those activities are not Big Data,” Schulz said, “I don’t know what on Planet Earth is.”
The point is that sometimes we get lost in the terminology and forget about the functionality we are trying to deliver. At these events, when Schulz asks participants whether Big Data applies to their environments, invariably many hands start going up when he describes the characteristics of Big Data and what it really means. There are many industries and applications that have been doing Big Data analytics for a long time – in CAD, government, in video and entertainment, healthcare, in scientific research, among many others. And for many of these organizations, Quantum has been the go-to supplier for housing, accessing and protecting their most challenging “Big Data” storage requirements.
The other reason we are relating this particular story is to try to convey the level of engagement that is taking place among participants at this event series, consistently vibrant and active across all of the various cities where events have taken place. During the dinner meetings, people are commenting, sharing stories, sharing experiences and sharing concerns. And then, when the formal meetings are over, they are sticking around, continuing their discussions, talking to peers, talking to some of the Quantum people and partners, keeping the conversations going.
“I’ve done a lot of these types of events through the years,” Schulz says, “and I find it’s always important to measure the success not just by the number of people that attend, but the level of engagement. At the events in this series, you can see how much the attendees are enjoying being there because of their level of engagement. When the events are done, people are not racing for the doors – they are staying, talking to one another, having follow-up conversations. And they are thanking us for facilitating a conversation and discussion, as opposed to a more traditional ‘death by PowerPoint.’”
The event series continues tonight with a dinner meeting in Uncasville, Conn., at the Mohegan Sun Casino, which is convenient to many cities in Southern New England, including Hartford, New Haven, Providence and Boston. Then there is a one-week break before the meetings pick up again in St. Louis on September 10; Portland, Ore., on September 11; and King of Prussia, Pa., on September 13. If any of these cities are convenient to you, it’s an event series worth checking out. Not only that, it’s free of charge. Register here.
August 17, 2012 2:55 PM
Posted by: Michael Tidmarsh
The Quantum event series Virtualization, Cloud and the New Realities for Data Protection has continued in full force over the past week and a half, with well-attended dinner meetings among senior-level IT professionals in Cleveland, Raleigh, Atlanta and, just last night, in Washington, DC. As has been the case with previous events in the series in Boston, New York, Chicago, Palo Alto and Houston, the discussions have been lively and engaging, focusing on some of the hot-button issues for IT professionals around data protection in environments that are increasingly virtualized and moving more towards cloud-computing deployments.
The events feature as keynote speaker Greg Schulz, founder and senior advisor at The Server and StorageIO Group (StorageIO). Schulz is an expert analyst with more than 30 years of experience across applications, archiving, backup, business continuity, disaster recovery, performance and capacity planning, cloud computing and virtualization. Other speakers at the events include David Chapa, Chief Technology Evangelist at Quantum; Henrik Rosendahl, Vice President, Cloud Solutions, Quantum; and Dan Duperron, Principal Technical Advisor, Quantum. The series will continue with dinner meetings in San Diego on August 22; Los Angeles, August 23; Uncasville, Conn. (Mohegan Sun Casino), August 29; St. Louis, September 10; Portland, Ore., September 11; and King of Prussia, Pa., September 13.
After participating in discussions at four events in quick succession during the past week, Schulz said he is finding a lot of consistency in the concerns and questions of participants across geographies and even across industries. We asked him to summarize some of his observations from the meetings in Cleveland, Raleigh, Atlanta and DC. Here is his response:
“I’m seeing continued strong interest in learning more about modernizing data protection. IT professionals want to move from swapping out media (e.g. tape, disk, cloud) like flat tires to finding and fixing issues and problems, along with removing complexity and cost vs. simple cost cutting or doing more with less.
“The cloud conversations continue around confidence in the services, including security, compliance, availability and cost – as well as how to use them, what to use them for, how to gain confidence and how to gain certainty. Some are using clouds, including hybrid, public or private, and some can only use private clouds today while others are exploring hybrid or public solutions. I’m seeing varying degrees of virtualization, which demonstrates that the market is far from reaching its full potential for virtual server deployment, both for consolidation as well as for the next wave of virtualization – including life beyond consolidation where virtualization is used for agility, flexibility and mobility as opposed to just for squeezing or reducing costs.
“Network bandwidth remains a concern for some individuals, particularly around issues such as moving or retrieving data to and from an off site or a cloud provider in a given amount of time. I’m seeing that some budgets are flat, some are increased, but all have more work to do, hence they are rethinking how, why, where and when data gets protected to remove costs, as opposed to cutting costs. I’m also seeing discussions around avoiding treating all applications and their data in the same manner as an effective means to reduce complexity and cost, as well as an alternative enabling organizations to do more with their resources, including people, budgets, available time, hardware, software, networks and services.”
Please let us know if you are facing the same backup and data protection challenges in your organization and what you are doing about it. Also, please stay tuned for our next update in the series next week as we move back to the West Coast.
July 2, 2012 1:38 PM
Posted by: Michael Tidmarsh
The event series Virtualization, Cloud and New Realities for Data Protection has now been completed to a chorus of appreciative accolades from excited participants. With strong attendance at the final event in New York last week, the series of five informal dinner gatherings across the U.S. attracted more than 100 IT professionals across a wide range of industries, all dealing with the challenges of handling data protection during a time of overwhelming data growth.
Across the event series, across the cities – from Boston to Chicago to Palo Alto, Houston and New York – there were certain themes that kept recurring for organizations dealing with data protection in today’s environment, particularly in addressing changes being wrought by the expansion of virtualization and cloud computing. Among the consistent and compelling themes were:
- Trust: This one kept coming up over and over again. As noted by storage expert Greg Schulz, who was the keynote speaker at all five events, “In the cloud, people were saying they don’t want to just blindly trust their data to anybody.” As organizations shift to cloud environments – whether public, private or hybrid – they put themselves in the position where their entire operations are dependent upon the availability of the cloud. Downtime, data loss, or other disruptions can be devastating. They are looking for “certainty” in their data protection solutions.
- Productivity Improvements: The challenges these days tend to get intertwined – not only are organizations creating and saving more data than ever, they are also facing tight budgets so they have to be far more efficient in utilizing modern storage solutions to keep costs down, enable scalability and support enhanced agility. With the growth of virtualization and the cloud, the issues are moving towards strategic discussions around how virtual machines and clouds can enable portability, flexibility, backup, disaster recovery and high availability.
- Identifying the Source of the Challenge: This is one of the points that really resonated, according to Schulz, who is founder and senior advisor at The Server and StorageIO Group (StorageIO). He likened data protection to driving a car – if you keep getting flat tires, rather than just changing the tires again and again, you have to find out why they keep getting flat and address the underlying challenge. Schulz advised to “move upstream” to find the source of the problem and then “come downstream” to apply the right technology solutions, whether they are deduplication, compression, tiering, archiving or others.
The other consistent theme across the event series was the strength of Quantum in addressing the unique challenges involved in “modernizing” data protection to address the new realities inherent in supporting virtualization and cloud computing. David Chapa, chief technology evangelist at Quantum, was also a presenter at the dinner meetings, and he focused on how the Quantum family of data protection solutions delivers “certainty” in dealing with today’s most pressing challenges, including:
- Big Data with tiered solutions and wide-area storage that enable organizations to maximize revenue and results by extracting full value from their data, no matter where it is in its lifecycle.
- Data Protection with deduplication systems and tape libraries that optimize backup and recovery, simplify management and lower costs. Also with all-in-one virtual server backup and disaster recovery solutions that protect virtual environments while minimizing impact to servers and storage.
- Cloud Computing, with a wide range of technology solutions that enable cloud backup and disaster recovery for physical and virtual servers.
As part of the presentations, Chapa pointed to several organizations that have been able to successfully deploy data protection solutions from Quantum to improve performance and save money in virtualization and cloud computing use cases. The Bend Memorial Clinic in Bend, Ore., is using a Quantum N-tier solution consisting of DXi deduplication, Scalar library, vmPRO software and Vision management. With the new solution in place, backups are six times faster and IT is delivering a four-fold improvement in productivity. At Xerox, Quantum has enabled cloud backup and disaster recovery through Quantum DXi V1000 virtual deduplication appliances and vmPRO data protection. With the Quantum solution in place, Xerox has been able to capitalize on a new market opportunity with the “certainty” of enterprise-class backup and disaster recovery.
Although the event series is over, the themes that brought all of these IT professionals together will continue to resonate for a long time to come. Data growth is only going to get more challenging, and the opportunities created by virtualization and cloud computing will continue to create new issues for organizations of all types. Fortunately, visionary vendors such as Quantum not only have the advanced solutions available to address your most difficult challenges, they are also committed to listening and creating unique forums such as these to discuss what it takes to enable the new realities of virtualization and cloud computing to reach their full potential. Keep an eye out for further educational opportunities from Quantum and check out their industry-leading solutions for modernizing data protection.
June 29, 2012 12:48 PM
Posted by: Michael Tidmarsh
Despite 100-degree temperatures and tropical storm warnings in the Gulf, approximately 35 IT professionals turned out for Tuesday’s dinner roundtable discussion in Houston on Virtualization, Cloud and New Realities for Data Protection. Indeed, with the threat of data loss due to weather-related disasters, perhaps the turbulent weather was an incentive for individuals to hear about innovative ways in which to modernize their data protection strategies.
With a broad range of individuals from industries such as energy and healthcare, the discussion quickly turned to challenges involved in managing data growth – not just for Big Data, which is certainly an important topic, but for all types of data, including the massive amount of unstructured data that is overwhelming IT infrastructures across all industries. By modernizing your storage infrastructure with critical tools such as deduplication, storage tiering, archiving and others, you have the opportunity to manage growth strategically and actually improve overall performance in your data centers.
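Of the modernization tools mentioned above, deduplication is the easiest to illustrate. The sketch below is a hypothetical toy example (fixed-size chunking and in-memory storage; production systems like Quantum's DXi appliances use far more sophisticated techniques such as variable-length chunking): identical blocks of data are stored once and referenced by their content hash.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking, purely for illustration

def dedupe_store(data: bytes, store: dict) -> list:
    """Split data into chunks, keep each unique chunk once, return the recipe.

    The recipe is the ordered list of chunk hashes needed to rebuild the data.
    """
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # identical chunks are stored only once
        recipe.append(key)
    return recipe

def dedupe_restore(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its chunk recipe."""
    return b"".join(store[key] for key in recipe)
```

The payoff is visible immediately: backing up data with many repeated blocks consumes storage proportional to the number of unique chunks, not the raw data size, which is why deduplication figures so heavily in managing unstructured data growth.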
So how do you get there from here?
Keynote speaker Greg Schulz, a widely known storage expert who is the founder and senior advisor at The Server and StorageIO Group (StorageIO) offered attendees a five-point plan to prepare and plan for the journey to a more modernized approach to data protection. What are the key steps?
- Have a vision, strategy and plan – for example, an itinerary and road map.
- Do more with what you have and boost productivity where needed.
- Reduce the cost of doing things (i.e., manual operations) and the cost of service delivery, as opposed to simply cutting costs.
- Reduce your data footprint impact. In other words, pack smartly for your journey.
- Modernize data protection, including high availability, disaster recovery, backup and archiving and business continuity.
What are the technologies that will help you successfully accomplish your goals? And why is Quantum the perfect partner to help you on this journey? Please stay tuned for our next and final blog post from this series for the answers to these and other important data protection questions.
In the meantime, there is still one more chance to participate live: The final dinner gathering of the series will take place this evening in New York City at Morton’s Steakhouse. If you would like to register, you can find the information at Virtualization, Cloud and New Realities for Data Protection.
June 26, 2012 9:29 AM
Posted by: Networking123
Tonight is the fourth in Quantum’s series of dinner roundtable discussions on Virtualization, Cloud and the New Realities for Data Protection and, if it’s anything like the first three gatherings, it promises to be a compelling and freewheeling affair. Tonight’s dinner takes place in Houston, following similar events in Boston, Chicago and Palo Alto. The series concludes on Thursday with a dinner meeting in New York.
Last Thursday’s meeting in Palo Alto had a strong turnout, with around 25 attendees from a wide range of organizations – large and small environments, private and public sector, with a lot of interaction and discussion. One of the interesting topics was virtualization beyond consolidation – where the focus expands from consolidation to agility. As virtualization becomes more widely accepted and becomes more of a platform for cloud-based initiatives, the issues move from ratios of physical to virtual machines, to strategic discussions around how VMs can enable portability, flexibility, backup, disaster recovery and high availability.
As in the earlier gatherings, when the topic turned to backup and cloud computing, trust was a key issue. “People wanted to talk about trust and confidence in clouds, including the reality that cloud data protection is a shared responsibility and that only you can prevent cloud data loss based on the services that you choose,” said storage expert Greg Schulz, who is the keynote presenter for all of these events and is the founder and senior advisor at The Server and StorageIO Group (StorageIO).
Also participating in the discussions are thought leaders from Quantum, including David Chapa, Chief Technology Evangelist. Because the trust issue keeps coming up in discussions about the cloud, it is fitting that the theme for Chapa’s presentation focuses on “Certainty in a Cloud Environment.” The presentation points to two specific use cases where Quantum has been able to work with customers to improve performance, productivity and trust in both virtualized and cloud environments. These are:
- Bend Memorial Clinic, in Bend, Oregon. BMC’s challenge was that data growth was outpacing the IT staff’s ability to manage it. It turned to a virtualized environment but sought a cost-effective solution that would protect both physical and virtual environments. It chose a Quantum N-tier solution consisting of DXi deduplication, Scalar library, vmPRO software and Vision management. The results: Backups are six times faster, IT productivity has quadrupled, and the time required to do restores has been reduced from days to minutes.
- Xerox was looking for virtual backup and recovery technology to enable its cloud backup and disaster recovery services. It chose Quantum DXi V1000 virtual deduplication appliances and vmPRO data protection to enable Xerox cloud services. The result: With enterprise-class backup and disaster recovery, Xerox has been able to capitalize on a massive new market opportunity enabled by the cloud.
How can some of these technology solutions enable you to deal with the new realities in data protection engendered by virtualization and the cloud? Tune in next time when we review the discussion of tonight’s event in Houston in our ongoing series on Virtualization, Cloud and the New Realities for Data Protection.
June 20, 2012 7:19 AM
Posted by: Networking123
Quantum hosted its second informal data protection dinner meeting among IT professionals on Monday night in Chicago and the discussion was quite animated – particularly when it came to cloud computing and what to do about storage and backup in the cloud. The event was titled Virtualization, Cloud and the New Realities for Data Protection and it certainly lived up to its theme.
Keynote speaker and storage expert Greg Schulz, founder and senior advisor at The Server and StorageIO Group (StorageIO), said one of the important issues among the more than 20 participants was “trust,” particularly when it came to trusting protection of their mission critical data in the cloud.
“In the cloud, people were saying they don’t want to just blindly trust their data to anybody,” Schulz said. “They want to have trust in their provider, their services, their tools, their capabilities. Protecting data in the cloud is a shared responsibility. Service providers have a role, but users have a complementary role in how they use the service provider, configuring their systems, running pilot programs and utilizing best practices. All of the major themes resonated – disaster recovery, backup, archiving – and there was a lot of cloud confusion.”
Schulz said one of the participants described a widely known situation where Amazon’s cloud service went down. One customer, in particular, did not go down with Amazon because it was using the Amazon service more strategically – as an adjunct to its overall storage infrastructure as part of an effort to increase availability. And it worked.
“A lot of companies are just going to the cloud because it may be cheaper than doing it themselves,” Schulz said. “If that’s the primary reason for going to the cloud, you’re probably missing bigger opportunities. If you use the cloud as a complement, you can increase availability. The key is to determine how you will use the resources available to you – including the cloud – to improve upon what you are currently doing.”
As Schulz noted, the discussion set up perfectly for the presenters from Quantum because their flexible and innovative approach to the cloud gives their customers all kinds of options for data protection. “Quantum’s story plays very well with the ‘trust’ factor because they can do it all – physical, virtual and also in the cloud,” Schulz said. In fact, the theme of the presentation by Quantum Chief Technology Evangelist David Chapa was “Quantum Certainty.”
Chapa’s presentation focused on Quantum solutions for big data, data protection and the cloud, with virtualization as a recurring theme because it is central to all of these trends. In fact, one of the key points of Chapa’s presentation was that all of these trends are converging quickly and customers are finding that they have to combine products and services together to solve their data management challenges.
In an era where trust is an issue, Quantum presented two case studies where customers have been able to achieve certainty in both a virtualized environment and in a cloud environment.
How were they able to achieve certainty? Tune in next time when we post our third post in this series following the next informal dinner meeting, which will take place on Thursday evening in Palo Alto at MacArthur Park Restaurant. There’s still time to sign up at Virtualization, Cloud and the New Realities for Data Protection.
June 15, 2012 9:10 AM
Posted by: Networking123
Data protection has never been more important, or more challenging. Organizations are generating and storing more data than ever and the volume just keeps getting bigger and bigger. With the broader acceptance of virtualization and cloud computing, the challenges are only getting more intense.
So what do you do about backup in this environment and how do you incorporate technologies that will help you to “future proof” your IT infrastructures? Those were among the questions addressed this week at an informal dinner gathering of about 20 IT professionals in Boston. The event was sponsored by Quantum and featured the storage expert Greg Schulz as a keynote speaker.
Schulz, founder and senior advisor at The Server and StorageIO Group (StorageIO), told the gathering that it’s time to start thinking in terms of modernizing your data protection schemes. In order to rethink your strategy, you have to set the proper stage and really understand your data: How much are you generating, when are you generating it, how long are you keeping it, how much of it is in physical machines, how much is virtual, how does the cloud fit into your overall strategy.
“When it comes to data protection, you want to make sure you are shifting your focus to the source of the problem,” Schulz said. “If you’re driving a car and you keep getting flat tires, instead of constantly changing the tires, you want to get to the root of the problem and find out why and how it’s happening.”
The idea, Schulz said, is to “move upstream” to find the source of the problem and then “come downstream” to apply the right technology solutions, whether they are deduplication, compression, tiering or archiving. There is a wide range of potential solutions, several of which were described in presentations from Quantum executives. Quantum also discussed some interesting use cases involving key customers.
What are some of those solutions and use cases? And how do you plan for the journey to modernizing your backup? Tune in next week when we will post our second update following the next informal dinner meeting in Chicago, which will take place on Monday evening. Other upcoming events in the series are scheduled for Palo Alto, Houston and New York. For details go to Virtualization, Cloud and the New Realities For Data Protection.