By Greg Lord (@GregLord11)
Businesses are focused on pursuing the holy grail of higher revenue and lower costs. As they evaluate new technology solutions to help drive this growth, CIOs are changing the way they deploy and manage business-critical applications, striving to leverage the ubiquity and cost efficiencies of the Internet for application delivery. Although every organization’s IT strategy and approach to application delivery varies, the common requirement across all organizations is that end-users need fast, reliable and secure access to all their business applications. This requirement has become increasingly challenging given the complexity of application distribution across multiple data centers, end-users located all over the world on various devices, and a growing list of business applications, such as customer relationship management (CRM), collaboration, product lifecycle management (PLM) and support portals, that users rely on every day.
For CIOs to successfully deliver applications to end-users within an organization, they need to understand the challenges in managing the Internet connection between an end-user’s device and the data center where a particular application is hosted, specifically at the enterprise level. The Internet was not designed to handle the demands and requirements of business use. Given the legacy architecture and routing logic of the Internet, the selection of routes between end-users and data centers is extremely inefficient, and once a route is selected, the transmission of data along it is slow and error-prone. The Internet itself, and large Internet-connected cloud data centers, are prone to congestion and downtime. In addition, mobile devices have different operating systems, browsers and connection types that introduce further complexity. The Internet offers no inherent web security protection, and it can be very difficult to gain visibility into, manage or control applications being delivered over it.
These challenges are what we call “The Enterprise Internet Problem,” which can result in lost revenue due to partner and customer frustration with poor response times and spotty availability. There are negative impacts on end-user productivity due to long load times, as well as data-loss vulnerabilities. Frustrated IT organizations struggle to troubleshoot issues and support complex application delivery architectures, let alone find the time to optimize the end-user experience.
To begin addressing the Enterprise Internet Problem, organizations typically try one of the following two approaches:
1. Implement a solution that lives within the four walls of the data center – either a physical hardware box or a virtual appliance.
The data center could be an organization’s own data center or the data center of its cloud or hosting provider. Any way you slice it, this approach doesn’t work because organizations need a symmetrical solution that addresses both ends of the application delivery path, and IT organizations can’t possibly implement a box or virtual appliance in every data center and every end-user location. This approach also introduces additional cost and complexity, because organizations need to purchase, implement and support these solutions, a challenge that is compounded as applications inevitably move across data centers and cloud environments over time.
2. Continue to invest in maintaining private network infrastructure.
This approach works to a certain extent, in that it helps address Internet performance and reliability issues, but it doesn’t scale because it limits access to applications and restricts organizations from leveraging the cost efficiencies and ubiquity of the Internet.
In order to solve the Enterprise Internet Problem, organizations need to look at various options, including a move to the cloud. Instead of requiring IT organizations to take on the burden of deploying and managing these critical capabilities on their own, cloud-based platforms can provide optimal Internet route selection, connection offload, load balancing, real-time failover, web acceleration, front-end optimization, DDoS mitigation and web application firewalls, capabilities that are not constrained within the four walls of a few data centers. Deploying these capabilities on servers and networks distributed across the Internet can be effective because it brings end-users closer to the applications needed to operate a business.
By understanding and addressing these problems, organizations can position themselves to instantly enter new markets, improve customer interactions, do business via lower-cost online channels, enable end-users to get more done in less time, and realize the holy grail of higher revenue and lower costs.
Greg Lord is the Sr. Product Marketing Manager responsible for Enterprise Solutions, including Enterprise Application Delivery and Cloud Solutions, at Akamai Technologies. Before joining Akamai, Greg held several Enterprise Sales and Marketing roles at Intel Corporation, including having led Cloud & Data Center Marketing for Intel’s Americas business. Prior to Intel, Greg was an IT Manager at both Reebok and Partners Healthcare. Greg is a certified Project Manager (PMP), has an undergraduate degree in Computer Information Systems from Bentley University, and his MBA from the University of Notre Dame.