Innovation Through Infrastructure


June 23, 2017  3:46 PM

Top 5 reasons SMBs should invest in software-defined hybrid cloud storage

Michael Tidmarsh
Cloud storage, Hybrid cloud

IT storage has become a critical decision point for small and midsize businesses (SMBs). Without the right storage solutions, SMBs run the risk of being overwhelmed by spiraling data growth, which can negatively impact IT costs, complexity, security and availability. At the same time, SMBs face the same pressures as larger companies to use data strategically to drive business operations, improve the customer experience and support digital transformation.

SMBs typically have the added challenge of tighter IT budgets and fewer IT personnel resources than enterprise companies. Therefore, the importance of managing return on investment (ROI) and delivering measurable business benefits can be even greater for IT decision-makers in SMB environments. Storage is a particularly important investment because data has become the lifeblood of most businesses and is growing significantly in volume, velocity and variety.

To address these issues, many SMBs have turned to public cloud storage options. However, using just the public cloud can be a challenge because IT has less control over costs, performance, security and compliance. In addition, if the public cloud storage is not centrally managed, the organisation can face additional risks caused by shadow IT. For those reasons and others, approximately 80% of midmarket customers are looking at how to integrate cloud-based storage with on-premises storage.[1]

The hybrid cloud storage environment created by the mix of public cloud and on-premises storage allows SMBs to maximise ROI while delivering the performance, capacity, scalability and flexibility required by developers, users and line-of-business managers throughout their organisations. The top five benefits of hybrid cloud storage for SMBs are:

  1. Improved ROI: With hybrid cloud storage, IT teams can manage costs more strategically, using the strengths of public cloud and on-premises infrastructure to their best advantage. For example, public cloud can be used to support seasonal spikes in demand and new infrastructure for DevOps, while on-premises storage can be used for applications where IT needs to maintain strict control over security and data protection. Both environments can be centrally managed from the same platform.
  2. Increased agility: With the right solution, IT organisations can use the storage platform of their choice, including all-flash storage or hybrid flash/spinning disk. They can also leverage features such as intelligent tiering and hybrid cloud object storage to maximise their investment. When IT is more agile, businesses benefit in a number of ways, including increased productivity, higher customer satisfaction, improved quality assurance and a greater ability to strategically leverage data analytics to meet the needs of employees, partners and customers.
  3. Reduced risk: SMBs can take more control over security, compliance and disaster recovery in a hybrid cloud storage environment. Hybrid cloud gives IT teams more options to support best practices in backup and reduce recovery time objectives and recovery point objectives. With disaster recovery as a service and backup as a service, IT teams can reduce overall storage costs and improve performance of production storage environments.
  4. Reduced complexity: With a software-defined model, IT teams can use automation and orchestration features in the management platform to streamline deployments, leverage shared resources and utilise the self-service capabilities of cloud environments. Software-defined hybrid storage solutions make scaling much simpler, and overall management is much less of a strain on IT resources—SMBs can have a single person managing the entire storage infrastructure from a single pane of glass.
  5. Ability to future-proof the business: SMBs are subject to the same volatility as larger organisations. Initiatives such as the Internet of Things, big data analytics and digital transformation are disrupting business processes across many industries. By leveraging a hybrid cloud storage platform, SMBs give themselves the best opportunity to maximise their storage investment over the next five years and beyond. The key to maximising hybrid cloud storage is to use a software-defined architecture. As noted by Neuralytix, “SDS provides a flexibility and consistency to span different deployment models with a consistent operational experience.”[2]

Conclusion

SMBs need to focus on the bottom line. They must invest in technologies that help them make more money now, while supporting the business through the unanticipated changes—and challenges—looming on the horizon. Software-defined hybrid cloud storage enables SMBs to improve ROI today with a platform that also prepares them for the future. It’s time to take the next step forward.

[1] “SDS: The Road to a Hybrid Cloud Storage Strategy,” Neuralytix, May 2017

[2] Ibid.

June 23, 2017  12:54 PM

Improving SAP HANA Performance, Resiliency and Flexibility: It’s a lot easier than you think

Michael Tidmarsh
SAP HANA

Early adopters of the SAP HANA in-memory database were stuck with one infrastructure platform option: industry-standard x86 servers. As some of these organisations have discovered the hard way, limited options have left them with limited performance, resiliency and flexibility.

These limitations are becoming problematic for many customers. It’s been more than five years since SAP HANA arrived on the scene, and it is taking on a much more important role in driving innovation and competitive differentiation for many organisations. Some may be new to SAP HANA, while others have older x86 implementations. But wherever they are coming from, organisations need to get more out of their SAP HANA investments.

There is a better way. Companies are no longer confined to using x86 servers for SAP HANA. They can now use more powerful enterprise-grade Linux servers. This is an important development in the evolution of the SAP HANA market. By leveraging more powerful servers, IT teams can boost performance, reliability and flexibility while lowering operational costs.

What kind of an impact can IT make with more powerful servers? Some companies see a price/performance gain of as much as 80% using IBM Power Systems versus servers built on Intel Ivy Bridge EX. They also benefit from twice as many transactions per core, four times as many simultaneous threads per core, and about four times the memory bandwidth.

Cost savings are another benefit. IT can run as many as eight SAP HANA instances on a single server, versus two on an x86 server. This provides a greater ability to run mixed workloads and reduces energy, space and maintenance costs. It also reduces scale-out sprawl for organisations that need to refresh older SAP HANA appliances. There is also more flexibility, particularly when using Tailored Data Center Integration (TDI), which is more versatile than an x86 appliance. With TDI, organisations can maximise the value of existing server, storage and networking assets.

Dispelling the complexity myth

The combination of these benefits leads to better business results, including faster and more accurate insights, better forecasting, improved customer service, greater support for individual lines of business, higher availability and myriad other competitive differentiators. Such gains are particularly important as data growth continues unabated and companies embrace big data analytics as part of their digital transformation endeavors.

As attractive as it may be to maximise the value of SAP HANA, there is still one obstacle some organisations must overcome: the inaccurate belief and unnecessary fear that deploying these more powerful servers—or migrating to them from an x86 appliance—is overly complex and requires resources that many IT teams don’t have.

This is simply not the case. In fact, with the open architecture of IBM Power Systems, combined with the high levels of support customers receive from IBM and SAP, IT teams will discover that once they make the move, they will actually reduce complexity in managing ongoing operations and scaling their deployments. These are the facts:

  • IBM Power Systems use the same Linux interface as x86 appliances, which means that the tasks involved in configuration are very similar.
  • By running SAP HANA on more instances on a single server, IT teams can scale far more easily, without the need to purchase, deploy and manage more hardware.
  • IT can increase physical and virtual capacity on demand through a cloud architecture that enables components to be added without disruption.
  • Greater reliability and resiliency increase availability and productivity, while reducing the amount of time IT teams must devote to troubleshooting and problem resolution.
  • Migrating an existing database is not nearly as difficult as you might imagine, and there is plenty of easily accessible support from SAP and its customers. In fact, IBM and SAP are both leaders in customer support and committed as partners to helping customers accelerate digital transformation with SAP HANA.
  • By using a more powerful, flexible, scalable and resilient server platform, you future-proof your SAP HANA deployment. Your team won’t have to go through a refresh cycle nearly as often, which saves time, hassle and the complexity involved in specifying, purchasing and deploying new systems every few years.

Conclusion

SAP HANA has the potential to be one of the most important drivers of innovation and competitive differentiation—now and in the future. Organisations can get far more value out of their deployments by using a server platform that delivers greater performance, resiliency and flexibility, specifically SAP HANA on IBM Power Systems.

With this solution, organisations can leverage the same architecture that enables top performing high-performance computing applications. SAP HANA on IBM Power Systems not only delivers greater performance, but is also simple to deploy, manage and scale.

If you are holding back from moving to a better platform over concerns about making a change, perhaps it’s time to re-examine things and re-evaluate the risks and rewards. You may find that what you now think of as a risk will actually turn out to be a benefit.


May 3, 2017  3:11 PM

How you can leverage hybrid cloud object storage to reduce complexity, increase agility and improve security

Michael Tidmarsh
Hybrid cloud, Object storage

IT teams are increasingly turning to hybrid cloud storage as a means to increase flexibility and agility, particularly in dealing with the challenges created by exponential data growth. At the same time, object storage has emerged as an important technology in managing and storing unstructured data in cloud environments.

While object storage has been integral to the development of the public cloud market, its usage has been more limited among enterprises and midsize businesses. That is partly because these customers have had only two options: they could use public cloud or on-premises object storage solutions.

Now, however, organizations can take advantage of object storage in hybrid cloud storage models thanks to the development of a new class of solution called hybrid cloud object storage. With hybrid cloud object storage, IT can deploy object storage on premises, in the public cloud and in a hybrid cloud, leveraging a streamlined, simplified object storage approach that uses the same technology wherever the data is stored.

Hybrid cloud object storage is an important breakthrough in the evolution of the hybrid cloud storage market because it provides a cost-efficient, agile and highly secure method for organizations to manage unstructured data and use that data for competitive advantage.

Object storage works particularly well in cloud use cases because it is much simpler to scale than traditional file and block storage solutions. It is also more flexible and manageable in handling large data volumes in multi-use cloud environments. Among the key advantages:

  • Greater flexibility: You can choose the deployment options that work best for your data workloads, moving easily between cloud and on-premises environments. You can also leverage highly adaptive object storage that you can scale and adjust by workload.
  • Improved security: IBM’s Cloud Object Storage (COS) solution incorporates a range of features designed to help organizations meet security and compliance requirements. These include built-in encryption of data at rest and in motion, as well as authentication and data control capabilities.
  • Increased scalability: Object storage has been utilized in the largest cloud environments in the world and is designed for virtually unlimited scalability. IBM COS software has been tried and tested at web-scale with production deployments exceeding 100 petabytes of capacity at multiple customers. It has the ability to scale to exabytes while maintaining reliability, availability, manageability and cost efficiencies.
  • Simplified manageability: Hybrid cloud object storage can provide always-on availability, supporting a wide range of tasks with virtually zero downtime. These include software upgrades, hardware maintenance, storage capacity expansion, hardware refreshes and physical relocation of the storage system.
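The scalability advantage above comes from object storage's flat namespace: data is addressed by bucket and key rather than by a hierarchical file path, so there is no directory tree to rebalance as capacity grows. The toy sketch below illustrates that semantic model only; it is an assumption-free teaching example, not how IBM Cloud Object Storage is implemented.

```python
class ObjectStore:
    """Toy illustration of object storage semantics: a flat bucket/key
    namespace holding blobs plus metadata. Unlike block or file storage,
    there is no hierarchy to maintain, which is what makes the model
    simple to scale and to replicate across sites."""

    def __init__(self):
        self._buckets = {}

    def put_object(self, bucket, key, data, metadata=None):
        # An object is the blob itself plus user-defined metadata.
        self._buckets.setdefault(bucket, {})[key] = {
            "data": bytes(data),
            "metadata": dict(metadata or {}),
        }

    def get_object(self, bucket, key):
        return self._buckets[bucket][key]["data"]


store = ObjectStore()
# The "path-like" key is just a string; the store sees no directories.
store.put_object("backups", "2017/05/db.bak", b"\x00\x01", {"tier": "cloud"})
print(store.get_object("backups", "2017/05/db.bak"))
```

In a real hybrid cloud object store, the same bucket/key addressing works identically whether the object lives on premises or in the public cloud, which is what lets one management approach span both.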

Conclusion

If you’re not yet familiar with hybrid cloud object storage, it’s time to catch up. Hybrid cloud object storage has the potential to be an important technology innovation for enterprise IT, particularly as companies continue to generate more and more unstructured data.

The ability to use object storage on premises or in the cloud gives IT teams much more flexibility to enhance agility, increase availability, simplify management, strengthen security and dramatically improve scalability. When it comes to object storage in the hybrid cloud, the future is now.


May 3, 2017  3:08 PM

The real cost of flash storage…and the higher cost of not using flash

Michael Tidmarsh
flash storage

Flash storage is rapidly displacing spinning disk drives for primary applications. IDC says 76% of enterprises plan to move more primary storage workloads into All Flash storage as legacy platforms come up for technology refresh.[1] 451 Research says almost 90% of organizations now have flash storage in their data centers while All Flash approaches “are becoming increasingly standard to support transactional applications.”[2]

Performance, of course, has been the main driver of the All Flash market. All Flash storage delivers orders-of-magnitude greater IOPS performance than spinning disks. With All Flash storage, organizations can modernize and upgrade their infrastructures to drive major business improvements and enable critical initiatives such as cloud computing, data analytics, mobile and social engagement, the Internet of Things (IoT) and security.

If there has been any impediment to the growth of the All Flash market it has been cost. All Flash storage has traditionally been more expensive than traditional spinning disks when measured on a per-gigabyte basis and, in some cases, IT decision-makers haven’t been able to justify the increased capital investment. Fair enough.

However, the dynamics are changing rapidly and dramatically. With significant price declines over the past few years, the cost of All Flash storage is approaching that of spinning disks—even on a per-gigabyte basis. Here’s what IDC has to say: “Continuing cost declines, coupled with flash-driven data reduction in primary storage environments in particular, should have the effective cost per gigabyte (the cost when factoring in storage efficiency technologies like data reduction) of enterprise-class flash media actually lower than 10,000-rpm and 15,000-rpm HDD raw capacity costs by the end of 2017 for most primary storage workloads.”
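The “effective cost per gigabyte” IDC describes is simple arithmetic: divide the raw media cost by the data reduction ratio, because each physical gigabyte of flash holds several logical gigabytes once compression and deduplication are applied. The figures below are illustrative assumptions, not quoted market prices.

```python
def effective_cost_per_gb(raw_cost_per_gb, data_reduction_ratio):
    """Effective $/GB after data reduction (compression + deduplication).

    A 4:1 reduction ratio means each physical gigabyte stores four
    logical gigabytes, so effective cost = raw cost / ratio.
    """
    return raw_cost_per_gb / data_reduction_ratio


# Illustrative figures only -- not quoted prices.
flash_raw = 0.40   # assumed $/GB raw flash capacity
hdd_raw = 0.12     # assumed $/GB raw 15K-rpm HDD capacity

print(effective_cost_per_gb(flash_raw, 4.0))  # flash with 4:1 reduction
print(effective_cost_per_gb(hdd_raw, 1.0))    # HDDs see little reduction benefit
```

With these assumed numbers, flash at a 4:1 reduction ratio lands at $0.10 per effective gigabyte, below the raw HDD cost, which is exactly the crossover IDC projects.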

Even as the price gap closes, the reality is that per-gigabyte costs don’t even begin to measure the real incremental value that All Flash storage delivers to today’s businesses when viewed through the lens of overall contribution to total cost of ownership (TCO).

And, with All Flash solutions that leverage modern software-defined storage architectures, such as the IBM FlashSystem V9000, the TCO advantages of All Flash storage are magnified even further. What are the TCO advantages of using software-defined All Flash storage arrays? Here are just a few:

  • Increase revenue: Organizations can process more transactions in shorter time frames and be far more responsive to the needs of customers, leveraging tools such as big data analytics and social networking.
  • Reduce costs and overhead: IT teams can reduce software licensing fees on databases and other applications, while lowering energy consumption costs and reducing the physical space required by the storage infrastructure.
  • Simplify IT: All Flash storage is typically much simpler to deploy, scale and manage than traditional spinning disks. With software-defined storage, IT teams can leverage automation and orchestration capabilities to reduce costs and risks.
  • Shrink the storage footprint: IT can leverage features such as virtualization, compression, data tiering, deduplication and data copy services to significantly reduce the storage footprint. This is critical in today’s era, where unabated data growth continues.
  • Accelerate time to market: Infrastructure can be deployed faster, which means users and IT can be more productive. DevOps teams can be faster and more efficient by processing larger and more complex datasets in shorter time periods to accelerate development and improve quality assurance.
  • Achieve better, faster, more accurate decision-making: With All Flash storage, the organization can be far more effective and efficient in using its data to drive real-time insights and decision-making.

When it comes to All Flash storage and TCO, there’s a new reality for IT professionals: it is actually more expensive, and riskier, to not use All Flash storage for primary applications than it is to invest in the right software-defined All Flash storage platform.

[1] “IDC’s Worldwide Flash in the Data Center Taxonomy 2017,” IDC, January 2017

[2] “Flash-based approaches are increasingly becoming mainstream for primary storage,” 451 Research, June 2016


May 3, 2017  3:06 PM

Four key mistakes to avoid in implementing big data analytics

Michael Tidmarsh
Analytics, Big Data

Competitive advantage is increasingly determined by how quickly and effectively organizations can leverage advanced analytics and insights to drive measurable results. McKinsey describes this as “The Age of Analytics” and says the critical question facing companies today is how to “position themselves in a world where analytics can upend entire industries.”[1]

Despite the growing importance of big data analytics, many organizations are still figuring out how to maximize the value it can deliver across the enterprise. According to one survey, nearly 50% of big data decision-makers said they are not leveraging big data extensively across all business units.[2]

Where are they coming up short? Here are four mistakes that can impede big data analytics efforts:

  1. Not having a plan. Big data analytics is not just a technology issue; it is also cultural. Management needs to buy in, and new processes have to be baked into the culture. If you don’t have a strategic plan in place, you won’t know which technologies to invest in or be able to establish the necessary governance and data management policies and practices.
  2. Not focusing on talent. Big data analytics requires specific skill sets. Harvard Business Review has called data scientist “the sexiest job of the 21st century,” but, in reality, there is still a shortage of individuals with the knowledge and experience to drive enterprise-wide deployments. Make sure you either have the talent in house or are working with technology partners that can supplement the skills and experience of your own teams.
  3. Not modernizing your infrastructure. The need for speed and accuracy in analytics puts intense pressure on the underlying infrastructure, particularly for workloads that have larger datasets and growing volumes and varieties of data. This is one of the reasons why companies are rushing to embrace All Flash storage and hybrid cloud storage solutions.
  4. Not upgrading the server platform. Because the storage infrastructure is so central to the delivery of big data analytics, many IT leaders believe that if they invest in the right storage platform their infrastructure challenges will be addressed. This is not the case. Big data analytics also puts enormous pressure on the compute infrastructure for processing speed, reliability, operational simplicity and resiliency.

Leveraging modern servers for analytics workloads

Modernizing the server platform is one of the first steps that organizations can take to support big data analytics. As you build your strategic plan and bring on the requisite talent, having a modern server infrastructure in place will accelerate your ability to deliver real-time actionable insight to your managers, employees and customers.

Many organizations are finding that they can drive immediate performance gains in their analytics workloads by using integrated server and software platforms that have been designed specifically for big data environments.

As an example, IBM’s high-performance Power Systems Linux-based servers are now available in configurations designed specifically for big data and analytics workloads, including packages for SAP HANA, NoSQL, operational analytics, data warehouses and several others. Research shows that there are clear advantages to modernizing your analytics infrastructure with these types of solutions, including:

  • Increased performance: The IBM POWER8 server has demonstrated 1.8x faster per-core performance for SAP HANA compared to x86 platforms, delivering faster and more efficient analytics of business data and setting a world record in the 2-billion-record category.
  • Accelerated time to value: Organizations can save setup time and maintenance costs by utilizing a complete pre-assembled infrastructure that has been designed specifically for analytics workloads with pre-installed and tested software.
  • High reliability and resiliency: As companies increase their reliance on analytics to drive business initiatives, uptime becomes more important than ever. IBM Power Systems are designed for 99.997% uptime and use self-monitoring and predictive failure alerts to pre-emptively migrate applications before failures occur.
  • Flexibility and agility: Organizations should be able to leverage either a scale-out or scale-up architecture for their analytics workloads, while also incorporating features such as server virtualization and support for multi-tenant cloud functionality.

Turning information into power

In the years ahead, competitive advantage will increasingly go to those organizations that are best able to use their information to create business value and drive innovation through real-time analytics and insight. The foundation IT puts in place today will go a long way in determining the success of the business in the future.

In order to put that foundation in place, IT leaders must address the potential pitfalls discussed in this article, namely:

  • Put a strategic plan in place.
  • Make sure you have the right talent, either on staff or through your tech partners.
  • Modernize your infrastructure.
  • Upgrade your server platform.

Any successful analytics initiative will be built on an infrastructure foundation that can deliver the requisite performance, capacity, scalability, reliability, resiliency and agility. As you’re building your plan and hiring the right talent, make sure you are investing in infrastructure solutions that will give you the best chance of success.

[1] “The age of analytics: Competing in a data-driven world,” McKinsey & Company, December 2016

[2] “Survey of Big Data Decision-Makers,” Attivio, May 25, 2016


May 3, 2017  2:58 PM

How transparent tiering can help you reduce complexity and increase agility in hybrid cloud storage

Michael Tidmarsh
Cloud storage, Hybrid cloud

Cloud computing has caused a seismic shift in how IT teams manage and deploy storage. The ability to quickly add storage resources in the public cloud—and to use cloud services for backup, archiving and disaster recovery—has created a groundswell of demand for cloud storage.

The overall cloud storage market reached nearly $24 billion in 2016 and will grow at a compound rate of nearly 26% a year through 2021, when sales are expected to reach a staggering $75 billion.[1]

However, while cloud storage has delivered significant value to many businesses, it has also created its fair share of risks and challenges— particularly for the IT teams charged with managing storage.

With public cloud, IT runs the risk of not having full control over key areas such as costs and performance. According to one survey, between 30% and 45% of the money spent on public cloud is wasted.[2] Security and compliance are also potential problem areas with public cloud, particularly when users and line-of-business managers deploy cloud services as shadow IT separate from the corporate IT department.

Why hybrid cloud storage?

For these and a host of other reasons, organizations are increasingly embracing hybrid cloud storage models that use a mix of public cloud services and on-premises storage infrastructure. Hybrid cloud storage gives IT more control over costs, performance, security and compliance.

Hybrid cloud storage also enables IT to be much more strategic in how, where and when it uses the public cloud to offload and augment important functions such as disaster-recovery-as-a-service or backup-as-a-service.

One of the fundamental benefits of hybrid cloud storage is that it provides infrastructure teams with another storage tier that can be deployed and scaled quickly, easily and strategically. Not only can public cloud be used for second-tier storage functions such as archiving, backup and recovery; it can also be used to scale production environments and support new initiatives or business-critical workloads, such as DevOps.

The benefits of transparent tiering

To achieve these benefits, IT teams have historically had to work long and hard to manage cloud storage alongside existing on-premises infrastructure. They have not had access to technology that would let them use public cloud storage with the same ease with which they use a local disk array in a hybrid cloud environment.

That has changed: the technology is now available.

It is called “transparent cloud tiering” and it is being offered for the first time in the IBM Spectrum Scale solution. With transparent cloud tiering, cloud storage becomes another software-defined storage tier on the menu, along with flash, disk and tape.

Intelligent tiering capabilities allow file, block and object stores to migrate non-disruptively among all tiers— based on information lifecycle management criteria and policies established by the IT department.

With transparent cloud tiering, enterprises can easily bridge storage silos on-premises while adding the benefits of cloud storage to their overall storage solutions. Transparent cloud tiering also reduces IT complexity and increases agility, giving IT more control over performance and security.
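The lifecycle criteria mentioned above boil down to simple rules: place and migrate data by tier based on attributes such as last access time. IBM Spectrum Scale expresses these rules in its own SQL-like policy language, not Python; the sketch below only illustrates the kind of decision such a policy encodes, and the tier names and age thresholds are assumptions chosen for illustration.

```python
from datetime import datetime, timedelta

# Illustrative sketch of the decision logic an ILM tiering policy
# encodes. The real Spectrum Scale policy engine uses a SQL-like rule
# language; thresholds and tier names here are assumptions.
def choose_tier(last_access, now):
    """Pick a storage tier from a simple information-lifecycle criterion."""
    age = now - last_access
    if age < timedelta(days=7):
        return "flash"   # hot data stays on the fastest tier
    if age < timedelta(days=90):
        return "disk"    # warm data sits on capacity disk
    return "cloud"       # cold data migrates transparently to cloud object storage


now = datetime(2017, 5, 3)
print(choose_tier(datetime(2017, 5, 1), now))   # recently accessed -> flash
print(choose_tier(datetime(2017, 1, 1), now))   # months cold -> cloud
```

The "transparent" part is that applications keep reading and writing the same files regardless of which tier currently holds the data; the policy engine moves blocks behind the scenes.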

Conclusion

As the use of public cloud storage keeps growing, IT teams continue to look for simple ways to incorporate public cloud services into their overall hybrid cloud storage strategies. The availability of transparent cloud tiering is an important breakthrough in adding versatility, simplicity and control to hybrid cloud storage models.

[1] “Cloud Storage Market Worth $74.94 Billion USD by 2021,” MarketsandMarkets, September 2016

[2] “2017 State of the Cloud Report,” RightScale, February 15, 2017


May 3, 2017  2:54 PM

All Flash all the time: Using All Flash storage to drive innovation, insight and cost savings

Michael Tidmarsh
flash storage

Why is your organization using or considering All Flash storage?

  • Do you need to drive greater performance for transaction-oriented workloads?
  • Are users or customers complaining about slow service?
  • Are you looking to maximize your investment and save money as storage requirements continue to grow?
  • Are you looking to modernize legacy applications to leverage big data analytics, social networking or cloud computing?

As All Flash storage has proliferated over the past few years, organizations across all industries have deployed the technology to support a wide range of applications and workloads. All Flash storage not only delivers major performance gains, it also offers potential improvements in efficiency, agility, security, reliability and total cost of ownership (TCO).

For some organizations, All Flash storage has provided a path to achieving new capabilities that were simply not possible with legacy spinning disk arrays. Here are examples of companies from three different industries where All Flash storage has driven dramatic improvements in efficiencies and enabled new levels of innovation.

Healthcare: Healthcare IT is in the throes of dramatic change. The infrastructure must support electronic health record (EHR) systems that are growing in size and complexity. In addition, healthcare is an area where mobility, big data analytics and the Internet of Things are major drivers of IT modernization and innovation.

UF Health Shands Hospital, the teaching hospital for the University of Florida, needed to support new capabilities in its Epic EHR system and was up against the limitations of its existing spinning disk storage platform. The hospital upgraded to an All Flash solution from IBM while also leveraging the virtualization capabilities of IBM Spectrum Virtualize.

With the new solution, the hospital has been able to reduce the size of its Epic database by 61% and is achieving significant cost savings in power and cooling. Performance is also much better: a 97% improvement in the runtime of Epic integrity checks and a 6X improvement in access times.

Financial Services: Many companies in financial services must deal with growing volumes of transactions and the need to deliver accurate, current information and services with no latency and no downtime. For Wanlian Securities Co., LTD., a state-owned securities company based in Guangzhou, China, dramatic growth in transaction volumes created serious challenges for legacy infrastructure.

“Not only was our storage architecture struggling to support increasingly busy trading systems and query needs, but also we could foresee a risk of experiencing unacceptable downtime during market hours,” recalls Chen Zhaohua, Vice President, Technical Information Center Director, Wanlian Securities. “The situation was going to worsen as we predicted transaction volumes would grow from three trillion to five trillion.”

Wanlian upgraded its storage infrastructure with IBM FlashSystem arrays and IBM SAN Volume controller with IBM Spectrum Virtualize software. The results: a 100x acceleration of storage I/O, from 10 to 0.1 milliseconds; a 62.5% reduction in trade clearing time, from 240 minutes to 90 minutes; a 15% improvement in CPU usage. Says Zhaohua: “All of this means we are much better equipped to move at the speed of the market and handle an ever-increasing number of transactions.”

Online Dating: Like many modern applications, online dating services require rapid response times and the ability to leverage real-time analytics and insights to serve customers. Customer retention is a critical goal of Plenty of Fish, which describes itself as the world’s largest online dating site with a community of more than 90 million registered users.

Performance was becoming an issue with its legacy spinning disk storage. “We couldn’t get enough performance out of disk,” says Owen Morley, Director of Operations. “When you flip over to flash it’s a whole new world. All of a sudden, storage isn’t the issue. When it comes to costs, flash pays for itself. We had an ROI of two months.”

All Flash storage has been a critical factor in company growth. The main database server now handles more than 30,000 batch requests a second, with an average response time from the application perspective of 4 milliseconds. “When you click on a page, we serve that page quickly,” Morley says. “We want to keep you happy, we want to keep you engaged, we want you to find that person you’re meant to be with.”

Conclusion

Whether it’s financial services, healthcare, online dating or any other industry, organizations are leveraging All Flash storage as a critical tool in modernizing their business model, reducing TCO and driving innovation. If you’re not yet on the bandwagon, it’s time to find out what All Flash storage can do for you and what to look for in an All Flash solution.


May 3, 2017  2:48 PM

Maximize your digital transformation initiatives with SAP HANA

Michael Tidmarsh
Digital transformation, SAP HANA

Digital transformation is redefining business and IT. By 2020 half of all global companies will see the majority of their business depend on their ability to create digitally enhanced products, services and experiences, according to IDC.[1]

This fundamental shift puts IT at the heart of the business. As we’ve seen in industries such as travel (Airbnb), transportation (Uber) and home entertainment (Netflix), those companies that do the best job of leveraging data across the business can not only gain significant competitive advantage; they can disrupt entire markets.

Enter SAP HANA. More than 345,000 companies worldwide use SAP for their business critical applications such as ERP, CRM, workforce management, financial management and supply chain integration.

The availability of SAP HANA as an in-memory database gives all of these companies an opportunity to drive their next-generation digital transformation initiatives. By integrating transactions with analytics—and leveraging both structured and unstructured data—organizations can use SAP HANA to achieve new levels of customer service, employee productivity, workforce transformation and business innovation.

The issue for many SAP customers is not whether to migrate to SAP HANA, but how to ensure that the migration delivers the most value to the business over the long term. That’s where the infrastructure comes into play. Typically, an organization is adopting SAP HANA to increase the performance of analytics so that ever-growing amounts of data can be used to drive faster, more accurate and more reliable decision-making and insights. So why compromise on performance? In fact, why not deploy solutions that deliver the highest levels of performance so the organization can truly maximize the competitive advantage enabled by SAP HANA?

Driving SAP HANA performance improvements

When SAP HANA first became available, customers had little choice in technologies to maximize infrastructure performance. There were commodity x86 servers and not much else. But that is no longer the case.

SAP HANA customers can now utilize much more powerful Linux-based servers that not only significantly increase performance, but also reduce maintenance costs, increase capacity, enhance agility, improve price-performance and enable bullet-proof reliability and resiliency.

In addition, with POWER8 servers from IBM, organizations can now enable virtualized instances of SAP HANA in production environments, with an 80% system utilization guarantee for enterprise-class systems.

These are pretty important benefits when you think about digital transformation and the impact it can have on your business. For example, built-in virtualization greatly improves resiliency and reliability, thereby mitigating the risk of downtime. When the entire business is built on digital technologies, downtime can have a huge impact on profitability, reputation and goodwill. Performance is also vital: every moment saved in delivering advanced analytics can translate into increased sales and greater customer satisfaction.

Investing in the future of your business

If you are investing in SAP HANA, you are investing in the future of your business. You are also betting on your ability to drive innovation, productivity, profitability and customer care through advanced analytics and real-time insights.

By choosing infrastructure that maximizes the crucial areas of performance, reliability and agility, you have an opportunity to gain competitive advantage—against companies that are not using SAP HANA at all, as well as competitors that rely on commodity servers and appliances. So the question remains: Why wouldn’t you do everything you can to maximize the value of your investment in SAP HANA?

[1] “IDC Sees the Dawn of the DX Economy and the Rise of the Digital-Native Enterprise,” IDC, Nov. 1, 2016

