For a while it looked like the 4G wireless market was going to go all WiMAX all the time, thanks to that technology's early market lead. But LTE deployments are gaining serious momentum among wireless operators, according to the second edition of Infonetics Research's biannual LTE Infrastructure and Subscribers market size and forecast report. And as wireless operators go, there will be a ripple effect on wireless enterprise WAN options.
Mobile operators are beginning to build out their 4G wireless infrastructure, and as of the most recent count, commercial LTE launches for 2010 will number 14 (include Verizon and Japan's DoCoMo in that group). Infonetics principal analyst of mobile and FMC infrastructure Stephane Teral says macrocells are where the main action is now; LTE-based services will follow, which is where the evolved packet core comes in.
Teral notes that the chief drivers behind LTE becoming the future-proof mobile broadband platform are its peak rates, latency and spectral efficiency. LTE speed? Tests have clocked upload and download speeds as high as 100 Mbps, but conservative real-world numbers may be closer to 20 Mbps.
Based on public announcements made by service providers planning LTE services, Teral projects in the Infonetics report that the number of LTE service subscribers will exceed 72 million by 2013.
But don’t expect LTE phones for the holidays. The report says that for the first five years of deployment, LTE will be predominantly “PC-based” (laptops, netbooks, dongles, etc.), with LTE smartphones expected to hit the market after 2011, which is when 4G LTE voice, data and multimedia can all mingle freely.
"We're going to need a bigger boat." That famous line could also be applied to wide area networks today, as more applications reside on the Web, software as a service (SaaS) activities increase and companies make more use of virtualized environments. Or, more accurately, we're going to need a faster and more robust network to handle the demands that are coming in terms of new applications and computing environments.
This became very clear at MIT's Emerging Technology Conference (EmTech) this week, which showcased some of the more innovative applications under development, selected by the staff of Technology Review magazine, and provided an opportunity for companies like Mozilla to talk about upcoming developments.
One of the more interesting applications profiled at the event is a computer game, called Foldit, that enlists tens of thousands of players worldwide to engage in “citizen programming” by determining the best structure for complex proteins – an exercise termed protein folding. By relying on more than 90,000 players, scientists can not only eliminate the need for massive numbers of dedicated servers, but can also hopefully unravel the complexities of things like the HIV virus.
The effort is supported by companies like Adobe and Microsoft and by the Defense Advanced Research Projects Agency (DARPA) – which makes me think there may be a more sinister side to protein folding than what is being presented.
One other project, called BitBlaze, comes out of the University of California, Berkeley, where researchers are working on a new way to combat computer viruses and worms by taking the same approach as fighting a flu virus. Just like flu viruses, evolving generations of computer bugs can morph to keep one step ahead of traditional checks and scans. The software under development targets the variant strains to keep pace with these changes and prevent the spread of a computer virus across networks.
A new face for Firefox
Mozilla’s chief evangelist and head of developer relations Chris Blizzard also took the EmTech stage to talk about upcoming plans for the Firefox browser, which has been under development for more than a decade and has been bouncing around as an Internet Explorer alternative for about five years. Firefox presently has about 300 million users worldwide and is adding one million users per week, said Blizzard. However, it is still a distant second to Microsoft’s IE6, which debuted in 2001 and is the most popular browsing platform in the world, even if it is showing its age.
The next version of Firefox will include accelerated 3-D rendering on the Web, which is perfect for advanced medical imaging applications. Subsequent versions will incorporate facial recognition technology to automatically identify users, and then mix that with technology that will automatically associate users with their email, online chats, search histories and any other footprint left on the Web. This has huge implications in terms of security, retail, social computing, business intelligence and so on.
Dr. Ronald Dixon, who heads up the Virtual Practice Project at Massachusetts General Hospital in Boston, is looking to apply social computing and collaborative technologies to patient care by developing techniques that extend the doctor-patient relationship. He has used email, Skype, video conferencing and other technologies to interact with patients who may have moved beyond the boundaries of a hospital and insured coverage. He is presently coordinating trials of medical kiosks in Singapore and the U.K. that might one day be used by people in nursing homes, assisted care facilities, hospices and other places that right now are on the edge of healthcare.
If “information is the currency of democracy,” says Dr. Dixon, quoting Thomas Jefferson, then “relationships are the currency of healthcare.”
Unfortunately, “what is best for the patient may not be best for the system in terms of making money,” Dixon said, noting that his email medicine and even the foreign trials of medical kiosks are viewed by the bean counters as nonessential since they do not directly drive revenue.
All of these technologies and many more profiled at EmTech require two things: Funding, to get them off the ground, and a reliable and effective network to support different types of data and the users who are pushing and pulling information. This is where companies like Riverbed, Silver Peak, Blue Coat, EMC, Brocade and even Google come in, especially as they work to develop new wide area and local area network schemes and change the rules of Web prioritization and access.
In order to succeed, journalism has to be a two-way street. Reporters talk to people, these people provide the framework for an article, and then people respond to the article through letters, email and op-ed articles.
Service journalism – the kind that populates the virtual pages of SearchEnterpriseWAN.com – is even more dependent on reader interaction, since it is more focused on news you can use than on the traditional who-did-what-to-whom-and-why approach. The readership is obviously more technical, more interested in news and tips that can be utilized in their day-to-day jobs, and more willing to share observations from the WAN front lines with other readers.
The following is an edited sampling of some of the letters and comments we received from the SearchEnterpriseWAN.com readership over the past several weeks via email, social networks and, yes, even old school mail routes.
On a recent series of articles and a video report focusing on the move toward IPv6 and the impending IP addressing crisis:
I believe that moving to IPv6 is almost impossible now. As far as I can see, all relevant network equipment adopted it a few years ago, but applications are still coded (many of them badly) for v4. More important, it scares many network administrators. So, unless something very unexpected happens, we are going to stay with the v4 infrastructure for a very long time.
– G. Michaelov, Technical account manager, Aman Computers
I’m not seeing a massive shift toward IPv6. Because most enterprises NAT a small set of public IP addresses to a large set of private addresses, the urgency of moving off IPv4 is just not there.
– T. Yohe, VP engineering, Stampede Technologies
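Yohe's point about NAT is worth unpacking: because many private hosts can share one public address, distinguished only by translated port, the pressure to move off IPv4 eases considerably. A minimal sketch of the idea (illustrative only; the class, addresses and port range below are invented for this example):

```python
# Toy source-NAT table: many private (ip, port) pairs share one public IP,
# each mapped to a distinct translated port on that address.
class SimpleNat:
    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.table = {}          # (private_ip, private_port) -> public port
        self.next_port = 40000   # arbitrary starting port for translations

    def translate(self, private_ip, private_port):
        key = (private_ip, private_port)
        if key not in self.table:
            self.table[key] = self.next_port
            self.next_port += 1
        return (self.public_ip, self.table[key])

nat = SimpleNat("203.0.113.10")
a = nat.translate("10.0.0.5", 51000)
b = nat.translate("10.0.0.6", 51000)
# Two private hosts, one public address: only the translated port differs.
```

Scale that mapping up to thousands of internal hosts behind a few public addresses and the "urgency" Yohe describes largely evaporates.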
In response to a LinkedIn query on improvements in the WAN jobs market and a related article on an apparent thaw in WAN employment opportunities:
Interestingly, WAN performance will usually be an issue for companies that have gone through a consolidation process. I’ve always approached WDS / APM as an enabler for companies to deploy other technologies on the WAN, and also as a way to better manage what they already have. So, based on this, even in a recession, these types of technologies should be high on the list of ‘things to address’ in any organization that’s connected with some geographical distance between sites.
Interestingly, in looking around personally for opportunities in this area, most if not all of the companies I spoke to (vendors and SIs) were looking to address recruitment again in Q4 '09. Some of the SPs I've engaged with are also actively building an 'application aware' service, which should generate some interest from the bandwidth provider side.
– A. Ford, proposition manager, Telindus
In response to a series discussing key points in selecting, deploying and ‘future-proofing’ your WAN solution, as well as a look at FTP file transfer alternatives:
I just want to expand on the point about carrier/cloud/managed deployments, as we are seeing increased demand in this area from businesses to service smaller offices. It is really difficult for companies to justify spending thousands of dollars at each small office (typically using DSL) for WAN optimization gear. The compelling value of carrier deployments is in the centralization of gear in POPs around the globe, so one can take advantage of economies of scale by sharing the WAN optimization equipment investment across a number of remote locations.
While there are trade-offs in that you cannot take advantage of compressing that local circuit to save on bandwidth spend, end users can still benefit from increased application performance across the WAN. Hybrid solutions are a great way to optimize the cost/benefit trade-off. Use CPE where the bandwidth savings due to compression can be significant, while using a network approach to optimize traffic for smaller offices where CPE expense cannot be justified.
– K. Lynch, product manager, Virtela
We’ve been asking a lot of questions about network security lately, specifically targeting those companies that are getting more involved in such things as cloud computing, software as a service (SaaS), managed services, and other areas where applications and data are more virtual and bounce around a network like digital nomads.
Most of the vendors we talk to, including those involved in very heavyweight and mission-critical applications like CRM and ERP, insist that current Internet security safeguards and firewall filters are enough to keep everything safe and secure on your WAN, LAN, PAN and so on. A representative for a very big and very well-known enterprise software company also said – quite blithely – that security is the user’s problem and not a big blip on their radar.
This attitude is surprising, given the fact that companies are expected to spend more on security software and services next year, even as the budgets for other infrastructure segments are declining as a result of the weakened economy.
Spending on security software and services is expected to outpace that for general IT, according to market researcher Gartner Inc. Spending on security software is expected to grow by approximately 4% in 2010, while spending on security services is projected to grow almost 3%, Gartner reports from a survey of more than 1,000 IT professionals with worldwide budget responsibilities.
The uptick in security spending is in part being driven by a shift toward managed security services, cloud-based email/Web-security solutions, and third-party compliance-related consulting and vulnerability audits and scans, Gartner points out.
Companies looking to validate a higher budget for security spending probably don’t have to look any further than the firms in their own geographic and industry-segment backyard. In its comprehensive Data Breach Investigations Report, Verizon Business documented 90 confirmed security breaches within the businesses that employ its services, totaling roughly 285 million compromised records. Roughly 74% of these breaches came from outside sources and 20% from insiders.
The industries hardest hit by security problems include retail (31%), financial (30%), and food and beverage (14%).
“Businesses should also recognize that new threats or vulnerabilities may require security spending that exceeds the amounts allocated and should consider setting aside up to 15% of the IT security budget to address the potential risks and impact of such unforeseen issues,” said Ruggero Contu, principal research analyst at Gartner.
We couldn’t agree more, especially as more efforts are made to speed things up and boost performance in the WAN, and companies rely more on cloud-based and managed services — all areas that we will be covering more as we move toward the final quarter of 2009 and look for ways to approach networking in a more strategic and security-minded fashion in 2010.
** Gartner’s report, Security Software and Services Spending Will Outpace Other IT Spending Areas in 2010, is available for a fee on the company’s website.
If former U.S. vice president Al Gore noticed his mailbox was a bit fuller than usual, he shouldn’t be too surprised since this week marked the 40th anniversary of the birth of the Internet.
About a decade ago, if you recall, Mr. Gore reportedly took some credit for inventing the Internet, or so the press reported and George W. Bush later sarcastically suggested during his campaign for U.S. president. In reality, Gore never said he invented the Internet, but he did maintain that, as a senator, he promoted the use of the Internet and supported its development. In any case, we at least think Mr. Gore is deserving of a few birthday cards and perhaps a small piece of IP cake.
Looking back over the years, the Internet has evolved from a kludgey and clunky messaging and file-sharing pipeline into the networking backbone for most of what is happening, and will continue to happen, in enterprise computing today. The evolution of the Internet has sparked a major revolution in computing, creating new business models in collaborative messaging, software as a service (SaaS), cloud computing, unified communications, and managed network services.
As a result of the increased activity on the Internet, networking architectures are changing dramatically, resulting in more robust wired and wireless structures and more capable, higher-performance WANs.
But, like everything in life, the longest and most remarkable journeys begin with a first step, and the development and launch of the Internet is no exception. Most people agree (including, we are sure, Al Gore) that the Internet was born on Sept. 2, 1969, when two computers at the University of California, Los Angeles exchanged small snippets of meaningless data in a first test of the Arpanet, an experimental military network. The first connection between two sites happened almost two months later, when the computers at UCLA “talked” with those at Stanford Research Institute in Menlo Park, CA (although the network crashed after entering the first two letters of the word “logon”).
Subsequent key events over the years included:
- The development of TCP by Vint Cerf and Robert Kahn in 1974, allowing multiple networks to communicate;
- The creation of the domain naming system in 1983, bringing to life such now-common appendages as .com, .gov and .net;
- The creation of the World Wide Web by Tim Berners-Lee in 1990, first developed to share information among researchers at CERN;
- The development of the Mosaic Web browser by Marc Andreessen and colleagues at the University of Illinois in 1993, the first Internet platform to combine graphics and text on a single page.
More important dates in the evolution of the Internet, from its birth to its current state, are available in an Associated Press dispatch on Google News.
Happy birthday, big guy, and best wishes for many more to come!
This has been a cloudy summer for most vendors in the WAN optimization and storage business, although it has absolutely nothing to do with the weather.
An increasing number of companies are relying on Web-based applications and software as a service (SaaS) alternatives — which are the essence of cloud computing — to channel resources to their users over both wired and wireless networks. As a result, the performance and reliability of those networks and point-to-point connections are critical to the success of IT operations – especially when dealing with a widely scattered and remote user base.
As SAP product manager and WAN optimization honcho Jana Richter says, “The more we go into these areas, where the applications are further from the users, the more the issue of accessing and using the applications in an optimally performing manner comes up.” This is a primary reason why SAP, which is heavily vested in CRM and ERP, has dipped its toes into WAN optimization.
Although the network is important as the data highway, other technology segments have become equally important as indispensable elements in WAN optimization. One of these areas is storage, which explains why such leading players as EMC Corp. are suddenly very focused on cloud computing and such things as WAN acceleration, applications and data prioritization, and business continuity. In fact, the company has partnered for some time with Brocade Communications Systems and Silver Peak Systems toward that end, which from EMC’s viewpoint is the “seamless federation” between internal and external resources, according to an EMC spokesman.
EMC squaring off for net management
EMC is presently working with its network-oriented partners to develop improved management tools that extend across distributed storage systems and the much-hyped network cloud and to deliver more detailed metering of usage patterns. This system, now in public beta, can perhaps be used to charge back users of data resources that flow through various local and wide-area networks or to deliver more management and control to avoid any storms within these applications and data clouds.
The goal, of course, is to parlay storage and WAN capabilities into new business and service models that provide fast and effective content distribution, reliable access to cloud-based applications, and unquestionable backup and redundancy (both for business continuity and compliance). A more street-level objective, however, is to kick the stuffing out of companies like Google and Amazon, which have both carved out a significant business in storage and Web-based access to data.
(Full disclosure: Google is already a very active Silver Peak customer, with more than 100 sites worldwide optimized with the company’s WAN acceleration and prioritization technologies – which leads one to believe there is no such thing as exclusive alliances in networking. It is every WAN for itself.)
Some of the technology drivers that are fueling the nexus of storage and networking include virtualization, data and IT center consolidations, and convergence, according to the experts at Brocade. There are also some technology crossovers: real-time de-duplication of data, which comes from the storage world, plays very well into networking to eliminate delays created by transferring the same information again and again.
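That crossover is easy to illustrate. Here is a minimal sketch of hash-based de-duplication as it might apply to WAN transfers, assuming a simple chunk list and an in-memory hash set (the function and data names are invented for illustration):

```python
import hashlib

def dedup_send(chunks, seen_hashes):
    """Transmit only chunks whose hash has not been seen before.

    Returns (chunks actually sent, bytes saved by sending references
    instead of repeated data).
    """
    sent, saved = [], 0
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in seen_hashes:
            saved += len(chunk)   # already at the far end: send a reference
        else:
            seen_hashes.add(digest)
            sent.append(chunk)
    return sent, saved

seen = set()
first, saved1 = dedup_send([b"block-A", b"block-B"], seen)   # all new
again, saved2 = dedup_send([b"block-A", b"block-C"], seen)   # A repeats
```

Real WAN optimization appliances do this at wire speed with rolling fingerprints over packet streams, but the principle is the same: information that has crossed the link once never needs to cross it again in full.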
In the evolving world of WAN optimization, applications are the new currency in terms of buying into a system that can deliver top performance and is flexible enough to handle changing user demands. Smart network designers will not only look very carefully at the types of applications that zip across a wired and wireless network, but also prioritize them according to their importance and “mission critical” makeup.
This is not always an easy task, especially if you are dealing with extensive networks that reach out to branch offices and remote users, as well as multiple service level agreements (SLAs). It becomes even more daunting because the natural tendency of users is to rate their most-used applications as the most critical – even though that may not necessarily be the case. This can create problems if an organization is running tens of thousands of applications. The trick is to zero in on the ones that are absolutely necessary, notes consultant Jim Metzler, vice president of Ashton, Metzler & Associates. It is critical to narrow that range, he says.
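The gap between "most used" and "most critical" that Metzler describes can be sketched in a few lines: ranking the same application list by usage and by stated business criticality yields different front-runners. The applications and numbers below are hypothetical:

```python
# Hypothetical inventory: criticality 1 = most critical to the business.
apps = [
    {"name": "email", "criticality": 2, "daily_users": 5000},
    {"name": "erp",   "criticality": 1, "daily_users": 300},
    {"name": "wiki",  "criticality": 3, "daily_users": 4000},
]

# Users would rank by popularity; the network designer ranks by criticality.
by_usage = sorted(apps, key=lambda a: -a["daily_users"])
by_criticality = sorted(apps, key=lambda a: a["criticality"])
```

Here the most-used application (email) is not the most critical one (ERP), which is exactly the mismatch that makes naive, usage-driven prioritization risky.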
In planning for new equipment additions or WAN upgrades, it also makes sense to review the applications bouncing around on a network and select equipment based on the mission-critical nature of your key software.
This is exactly what the IT staff did at Performance Health Technology, Ltd. (PH Tech), a healthcare information services provider that specializes in providing up-to-date information on patient benefits and insurance coverage to doctor and hospital subscribers nationwide. The company previously relied on T1 and a WiMAX connection, but ran into all sorts of reliability issues – some related to weather interference. The solution was to install an Ecessa Corp. PowerLink WAN controller that could quickly route around faulty connection addresses and channels to find a reliable connection.
The idea is to keep the most important applications and data up and running, even when network issues create a roadblock. This goes way beyond basic reliability, of course, since the health and welfare of critical networks have a direct impact on business continuity.
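The route-around behavior described above can be sketched generically. To be clear, this is not Ecessa's actual logic or API; it is just an illustration of priority-ordered failover, with invented link names:

```python
# Illustrative failover: walk a priority-ordered list of WAN links and
# pick the first one that passes a health check.
def pick_link(links, is_healthy):
    """Return the first healthy link in priority order, or None if all fail."""
    for link in links:
        if is_healthy(link):
            return link
    return None

links = ["t1-primary", "wimax-backup", "dsl-lastresort"]
down = {"t1-primary"}                       # simulate a T1 outage
active = pick_link(links, lambda l: l not in down)
# Traffic shifts to the backup link while the primary is unreachable.
```

A production WAN controller layers much more onto this skeleton (continuous probing, per-application policies, session persistence), but the core decision is the same: detect the faulty path and steer critical traffic onto a working one.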
Over the next several weeks, SearchEnterpriseWAN will be looking closely at issues including WAN reliability, fail-safe and contingency planning, specifically focusing on companies that have tackled these issues at the mission critical front lines. These profiles show that pre-planning is crucial to developing and launching an effective network, and absolutely mandatory as companies rely more on unfaltering access to applications and data.
Just like other technology sectors, the WAN optimization market has been battered by the recession and a cautious reluctance by IT types to spend any significant money on network upgrades. Nevertheless, the WAN optimization market has survived and even (relatively) thrived in this tough and bitter economy, with sales hitting about $226 million in Q1 this year.
Things are just as tough outside the U.S., although the picture is a tad brighter in Asia-Pacific countries (including Japan), which had been experiencing 21% yearly growth before declining to 13.3% this year. The region still chalked up revenues of US$279.6 million, according to our friends at Frost & Sullivan. The market researcher expects growth to rocket to 22.2% next year, with revenues hitting US$831.6 million by 2015.
After publishing yesterday’s article on WAN jobs in and out of the U.S., I came across the following chart that had a little bit more info on WAN salary ranges:
The site also breaks down salary by city and state. A WAN engineer in New York City, for example, will make on average $51,135-$80,000, while their counterpart in Austin, Texas will make $45,550-$57,000 … although the latter will also have a substantially lower cost-of-living.
Not surprisingly, California tops the state charts, followed by New York, but a quick browsing reveals there’s opportunity nearly everywhere, depending on what your salary expectations are and where you’re prepared to travel.
One of the most overused clichés in the English language may very well be ‘give the shirt off my back’, as in I would do anything I can to help you out or lend a hand. This hasn’t stopped people from casually throwing it around (like an old shirt), or prevented enterprising companies from bending over backwards (whoops…there’s another) to use it as the centerpiece of a marketing campaign.
This is the case with Uplogix, Inc., a Texas-based provider of automated network management solutions, which recently launched a campaign that offers a free T-shirt to promote its technology. That technology reportedly extends network management capabilities to the very edge of your wired and wireless architectures, and the company claims it reduces the cost and complexity of managing networks by automating remote management and recovery even when the network is down. The promotion debuted late last month at Cisco Live! in San Francisco (Uplogix is a Cisco partner).