I’ve been a writer and editor in the enterprise networking market for 11 years now, and one topic I dread being asked about is network management. I didn’t really understand it when I started learning about networking, and I don’t really understand it now. But the more I learn about the topic, the better I feel about that, because I meet few people who understand it much better than I do.
The problem is not network management itself; the idea is straightforward and is of obvious importance, especially in companies where the network has a big influence on business success (which is most companies, these days). My challenge has been in attempting to understand and categorize a wide range of products that simply defy categorization.
When I ask readers about network management, they often define it according to the particular tool they use, which can range from a basic protocol analyzer to a full-blown suite like HP OpenView. Within that range there are product groups that seem like they should be comparable, but features and functionality vary wildly. One network monitoring tool might show a schematic of your network and alert you if a device fails or traffic levels reach a predetermined unhealthy level. Another might give detailed usage data for every device and performance metrics for each application running on the LAN and WAN, all presented in easy-to-read charts that compare actual results to past performance and service-level guarantees.
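To make the first kind of tool concrete, here is a minimal sketch in Python of what "alert if a device fails or traffic crosses a threshold" boils down to. The device names, the poll_device() stub, and the 80% threshold are all made-up illustrations, not any vendor's actual product logic:

```python
# A bare-bones monitoring loop: poll each device's link utilization and
# raise an alert when a device is unreachable or over a set threshold.
# Everything here (names, numbers, the poll stub) is hypothetical.

UTILIZATION_THRESHOLD = 0.80  # alert when a link is more than 80% utilized

def poll_device(name):
    """Stand-in for a real SNMP or ICMP poll; returns None if unreachable."""
    fake_network = {
        "core-switch-1": 0.45,   # 45% link utilization -- healthy
        "edge-router-2": 0.92,   # 92% -- over threshold
        "branch-fw-3": None,     # not responding
    }
    return fake_network.get(name)

def check_devices(names):
    """Return a list of alert strings for down or overloaded devices."""
    alerts = []
    for name in names:
        utilization = poll_device(name)
        if utilization is None:
            alerts.append(f"ALERT: {name} is not responding")
        elif utilization > UTILIZATION_THRESHOLD:
            alerts.append(f"ALERT: {name} at {utilization:.0%} utilization")
    return alerts

if __name__ == "__main__":
    for alert in check_devices(["core-switch-1", "edge-router-2", "branch-fw-3"]):
        print(alert)
```

Everything a full suite adds — historical baselines, per-application metrics, SLA comparisons — is elaboration on this same poll-compare-alert core, which is part of why the products are so hard to compare.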
This makes it difficult for network managers to choose appropriate products. Muddying the waters even more is the recent overlap between network and applications management, as well as security management and monitoring. Many products incorporate applications and security elements, making it even harder to compare them. Add in that enterprises typically use several network management products in a layered fashion, all connected to element managers for different devices, and you have a complex scenario.
How can you make sense of network management and use it to its fullest advantage? One way is to have a clear game plan. The report, Network management systems: The good, the bad, and the ugly, brought to you by our partnership with Info-Tech Research Group, provides a step-by-step methodology for evaluating your network management environment. The resulting analysis can help you identify where your existing products are already managing the network well, how they work together, and where you should be making investments in additional products or upgrades. Check out the report and let us know if it helps make your network management look any less ugly.
Last night I was watching Attack of the Show again and they mentioned a site called BuzzFeed, which picks up the latest and greatest Web trends and conversations and compiles them all with an editorial perspective. One of the trends they had picked up on of late was the rise of “anti-social networking” sites, which are pretty much what you’d think.
Enough people have gotten fed up with Friendster and LinkedIn (and hearing how social networking is the new black) and have formed parodic sites like Snubster, EnemyBook, and isolatr… designed, in the words of EnemyBook, to “disconnect you to the so-called friends around you.”
I suppose social networking was overdue for a backlash, although you probably don’t need me to point out the irony of connecting in order to disconnect. And if you believe that parody is the sincerest form of flattery (in the world of social media, in particular), then you have to think that the authors and users of these sites are really more interested in promoting an agenda: a sneering, iconoclastic stance that nonetheless snuggles right up next to that which it seeks to mock — like Sid Vicious spitting out the lyrics to “My Way,” the essence of punk rock, knowing all the while that for all his swagger, he was really a pretty boy in an extremely well-marketed boy band.
Anyway, if you really don’t want to connect to other people, why not just go to the library and bury your nose in a book?
The other interesting thing about anti-social networking sites, as pointed out in naturalsearchblog, is their effect on search engine optimization efforts. It undermines your efforts to associate with the “right” keywords if you go and list a bunch of stuff that you hate. (Maybe that’s why Hatebook makes their content pages uncrawlable.) I thought of this same problem once when I was listing all the things I disliked on my personal blog profile… I didn’t want the system to connect me with other people who listed “Thomas Kinkade” or “beer pong” just because I said I didn’t like those things. So I scrapped the list.
Of course, maybe that’s the point of anti-social networking… I can make un-connections with other cranky people who hate the same things I hate. Is it easier to relate to someone about your dislikes than your likes? Hmmm… well, speak ill of Steve Ballmer on any Linux forum and you’ll probably find a friend for life.
Tell me you haven’t heard this before: Getting a certification earns you more pay. This week, however, Foote Partners LLC released a study revealing just the reverse: average premium pay for uncertified workers now trumps that of certified workers. Over the past year, the firm has seen average premium pay increase 8% for uncertified workers and decrease 2.3% for certified engineers.
I sat down with David Foote, cofounder and CEO of Foote Partners LLC, a few weeks ago to discuss the value of networking certifications in the job market. At that point, the gap between base pay for non-certified and certified IT workers was just coming to a head. Foote saw that managers were beginning to weigh the skills IT professionals had to offer over the certifications they had obtained. These new statistics bear that out: in less than a month, we’ve seen a huge difference in how certifications are viewed.
“Part of the reason for this is the steady convergence of IT and business as, quite clearly, the design and delivery of products and services is heavily enabled by technology,” Foote stated in his report. New technology has always led to new specialized jobs (think IT engineers). Given how rapidly the tech industry evolves, it should come as no surprise that the IT job market would follow suit. To paraphrase Foote, the toll that skill certifications are taking is just a drop in the bucket compared with the changes we will see in the industry as a whole.
“IT professionals today have to be routinely knowledgeable about a whole lot of things that have to do with their employers’ industry, customers, and products–enough to take a strategic as well as tactical role in growing the business,” Foote said. But is it possible for IT to be completely converged in the business? Foote Partners LLC found that managers are most likely to hire IT pros with superior business skills, over IT pros with superior tech skills.
Does a decrease in pay for certified engineers mean that the workforce will start to see less-knowledgeable workers? IT guys and gals constantly juggle between certifying, schooling and getting work experience, and it finally seems that work experience is more valuable to managers. We can hopefully see experienced IT workers getting the recognition they deserve. No more struggling to prove their skills on paper; no more sacrificing work experience to chase down a certification.
Will this mean certifications will no longer exist? Should you request a refund of your Cisco training camp check? Not entirely. For one, Foote says, “The Department of Defense has made [the decision to make]* certification a condition of employment,” meaning that once the policy takes effect, the Department of Defense, under Directive 8570, will hire only security workers who hold security certifications. So if you plan on helping Uncle Sam’s security sector five years from now, these stats won’t mean a great deal.
What else should you expect to see? Foote Partners LLC says that “the IT career ladder has been replaced by meandering career paths that span business functions and enterprises.” So rather than a rigid, one-way climb, think more about moving up, diagonally and across a jungle-gym rope ladder.
*The Department of Defense Directive 8570 is not mandating certifications for another five years.
Back in July, I wrote in a SearchNetworking newsletter about the challenge of choosing the right network management tool:
There’s not only overlap among these tools, but also a lot of variation in what they aim to manage — so much so that an apples-to-apples comparison is almost impossible to make. So far, it’s up to the people using the tools to clear up the confusion and determine the best approach.
I then asked readers to send me their thoughts, and received a very insightful reply from Internet security professional Rob Newby:
Network Monitoring and Management is a space which has been booming in recent years. A number of tools have grown up to monitor jflow, netflow, cflow, etc. There are more SIM, SIEM, and Log Management tools than I care to think of at present, most of them starting with “Net” something or “Log” something.
However, like all the simplest questions, “Why is the sky blue?”, “Why is there thunder and lightning?”, etc., the answer is longwinded and complex, and not as simple as this.
I have worked as an SE, and lately Product Manager for various companies, selling IT security tools, network add-ons, devices, software and hardware. For as long as I can remember, people have asked for centralized management and simple monitoring. The problem, of course, comes from the fact that there are no standards for these security devices and tools, apart from weak protocols such as SNMP and syslog, which are not up to the task of controlling and watching a network of hundreds of nodes.
To prove the lack of alternatives, HP OpenView, an SNMP tool which gives a picture of network health by picking up SNMP traps, is still as popular today as it ever was. Nagios, an open source equivalent, is still used in many enterprise environments. Syslog collectors are available for all the “Net” and “Log” devices mentioned above.
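Newby's point about syslog being a weak standard is easy to see at the protocol level. Every syslog message leads with a priority field, `<PRI>`, where PRI = facility × 8 + severity (per RFC 3164) — and that thin framing is about all a collector can count on. A minimal sketch of decoding it:

```python
# Decode the <PRI> field that opens every syslog message (RFC 3164):
# PRI = facility * 8 + severity. This is roughly all the structure the
# protocol guarantees; everything after it is free-form text, which is
# one reason syslog alone isn't up to managing hundreds of nodes.

SEVERITIES = [
    "emergency", "alert", "critical", "error",
    "warning", "notice", "informational", "debug",
]

def parse_priority(message):
    """Return (facility_code, severity_name) from a raw syslog message."""
    if not message.startswith("<"):
        raise ValueError("missing <PRI> field")
    pri = int(message[1 : message.index(">")])
    facility, severity = divmod(pri, 8)
    return facility, SEVERITIES[severity]

if __name__ == "__main__":
    # <34> = facility 4 (auth), severity 2 (critical) -- RFC 3164's own example
    print(parse_priority("<34>Oct 11 22:14:15 host su: 'su root' failed"))
```

Once past the PRI field, each vendor's message body is its own format — which is exactly why a common event format for log management devices, mentioned below, is such an appealing proposal.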
Because of the lack of standardization, centralization is increasingly difficult unless you have some sort of vendor tie-in. Microsoft’s Operations Manager (MOM) is looking to be the most likely candidate for popular centralized management as the market crawls forwards to its decision. At the moment, it is wide open, however… and vendors are also moving towards SOA type offerings which can interact without the need for building APIs.
The market itself is crowded and becoming more complex. It is hard to make progress in any of these areas, and those leading the standards are the ones who will inevitably make the best of the markets. A common event format is being proposed for Log Management devices, for example. If you can’t standardize the software everyone is running, standardize the output — it makes sense.
Last Friday, G4TV’s Attack of the Show named the wireless Belkin N1 Vision the “most awesome router ever.” Co-hosts Kevin Pereira and Olivia Munn rattled off a list of winning specs including 802.11n (although it runs in “mixed” mode), multiple SSIDs, a “guest-enable” feature and easy installation without CDs or software (you just turn it on and your computer detects it automatically).
What’s most impressive about the N1, however, is its sleek design and LCD display, which tells you far more about your connectivity status than the traditional blinking lights. The N1 is getting a lot of press for its sexy appearance; it looks more like an iPhone than your typical Linksys box.
So is this just for home users with gadget lust? Would you shell out $200 for this sleek 802.11n router — or would you prefer something that looks more like, well, a router? (Hey, we thought routers were sexy all along.)
Recently, Tom Keating at TMC wrote in his blog about Digium acquiring Switchvox, a proprietary Asterisk-based VoIP solution. Keating had interviewed Asterisk inventor and Digium CTO, Mark Spencer, who explained that one of the goals of that acquisition was to assimilate some of the proprietary technologies from Switchvox back into open source. Keating remarked:
I recalled Mark Spencer’s IT Expo keynote where he espoused the benefits of truly 100% open source solutions and how this contrasted sharply with some of Digium’s competitors such as Fonality. Again, Mark called hybrid-open/proprietary solutions “evil.” I couldn’t help but think of Digium vs. Fonality as Superman vs. Bizarro. Who is Bizarro and who is Superman I leave for you to decide…
One month ago, at TMC’s Internet Telephony Conference & Expo in Los Angeles, I had the pleasure of meeting Mark Spencer. Unfortunately, I had missed the relevant part of Mark’s keynote, which (fortunately) Greg Galitzine summed up nicely in his blog:
[Spencer] mentioned the evolution in the open source world, where things have gone from a simple “good versus evil” debate (open source vs. proprietary) to a complicated new world where we find open source (good) fake open source (bad) proprietary open source (evil), and even proprietary hybrid hosted (really evil).
Naturally, being drawn to all things evil, the next day at the conference I met with the dark lord of the open source PBX world, Chris Lyman. OK, Chris isn’t actually evil (to my knowledge) — he’s the CEO of Fonality. Fonality, one of the “proprietary hybrid hosted” companies, makes low-cost phone systems — according to Lyman, their trixbox platform is the first and biggest built on top of Asterisk code.
[Photo: Chris Lyman shows off the trixbox appliance with a tricked-out case mod. (And it’s green, too…)]
According to Fonality’s website, being a hybrid-hosted phone system means that the free software is first downloaded by a business and installed on a local computer and local IP phones. After this step, the local computer connects to the Fonality network where server health, call quality and usage are constantly monitored. The hybrid-hosted nature of trixbox Pro also securely extends the phone system outside the corporate firewall, so an employee’s extension can follow them when they work from home, remotely on a laptop, or even on a mobile phone. (Thanks to Alicia diVittorio for tracking down the definition.)
Sounds like a pretty good arrangement, especially for SMBs who may not have the resources on staff to manage a premise-based solution. In a tip on SearchVoIP.com, Yankee Group senior vice president Zeus Kerravala advised companies:
Even if you’re a predominantly do-it-yourself IT organization, consider a hybrid environment where the hosted services are used for some of the smaller branches and telecommuters. This will probably scale much more easily for you as you move more locations over to VoIP.
What I’m trying to figure out here is why the proprietary hybrid hosted model is “really evil,” and I can’t seem to track down Mark Spencer to comment. (Hey Mark, if this is a secret alter ego thing, I’d be just as happy to talk with Superman!) Personally, I’ve been a proponent of open source for some time now. I understand the benefits of encouraging innovation and avoiding vendor lock-in and lowering per-port charges. So, is Fonality locking in their customers? Are they corrupting open source ideals? Will they stifle innovation? Will mixing open source and proprietary code cause an explosive reaction, like encasing Kryptonian crystal in green kryptonite?
And, despite all this, “Most of the people buying the phone systems in the SMB space don’t even care if it’s open source,” Chad Agate, Co-Founder & CEO of NeoPhonetics, a Digium partner, said during a conference session on “Selecting an open source VoIP solution for the SMB.”
When I asked Lyman about the whole “really evil” issue, he pointed out that Digium has its own tricky issues and isn’t as pure as the driven snow when it comes to open source, given its “Digium Waiver.” (Read xrobau’s blog on Fonality and the GPL for more about that.)
“This shouldn’t be a war of the Davids; the Davids should join forces and take on Goliath, which is Cisco,” Lyman said. “Fonality and Digium don’t equal one half of one percent of U.S. PBX market share.”
Around this time last year, network access control (NAC) was the be-all, end-all for network security. Performing pre- and post-admission checks on devices before allowing them access to the network and applications was still a relatively fresh concept.
And, as with every new thing, vendors scrambled and clawed to get their solutions to market and offer a new or different form of NAC, adding in one or two new components, but keeping the rest pretty much status quo.
Now, however, it seems it’s all been done. While many key vendors offer some form of NAC — Cisco, Microsoft, Juniper and others — it’s getting increasingly hard to differentiate between them, since NAC has entered the realm of commoditization. There are also still a number of vendors — Vernier, Nevis and many more — offering point-based NAC appliances and tools to fill the gap, but even those solutions vary in only minuscule ways.
I didn’t really see things that way until a recent chat with Current Analysis senior analyst Andrew Braunberg. While we discussed some additions and enhancements to Juniper’s Unified Access Control (UAC) NAC products, Braunberg quickly pointed out that NAC has gotten to the point where there isn’t much that can be added to it that isn’t already there. Sure, vendors can enhance certain elements and integrate NAC with other tools, but the core functionality of a NAC solution is likely not to change much for a while.
“There’s not really going to be anything new under the sun in the NAC market over the next few years,” he said. “Most of it is already available. Vendors will continue fortifying their NAC solutions.”
I have to agree. It seems the time for radical developments in NAC has passed. That’s not necessarily a good thing or a bad thing. It just is. I’m curious, however, what that next big NAC development will be a few years from now. I’d like to ask you: Do you have any predictions on where NAC is heading? Do you agree or disagree that NAC solutions have reached a plateau? How will that affect your NAC purchases moving forward?
For about as long as routers, switches and cables have been connecting critical functions of the business network, people have been using that same technology to connect socially — whether for sharing notes on open source tool configuration, trading MP3s of their favorite bands, or swapping recipes for SPAM Fra Diavolo.
While the former may have its place in the business world, the latter can drain productivity and bandwidth if employees are “networking” on the clock. But this week, SearchCIO news writer Shamus McGillicuddy reported on a recent study that concluded fewer than half of IT managers polled banned employee use of sites such as MySpace and Facebook.
What’s interesting is how quickly many businesses have not only shrugged their shoulders at this phenomenon, but have actually jumped on the social networking bandwagon. For example, a year ago, McGillicuddy reported how IT execs were willing to exploit the Web 2.0 wave by getting into blogging and other online community-building activities. And in September, blogger Jeff Kelly wrote:
The social networking application market will grow to over $420 million by 2009, a whopping 815% increase from its 2006 size of $46.8 million, according to a recent report by IDC. As the market develops, the report continues, social networking functionality will increasingly be built directly into the foundations of communication platforms including email and IM. (Batten down the hatches: Enterprise social networking market set to explode!)
So what does this mean for networking pros, aside from more traffic to support on the IP network? First, I wonder whether the benefits of shared information will ever be enough to warrant taking time out of a schedule already overwhelmed with building, maintaining and troubleshooting the network, dealing with end users, and putting out fires. I also wonder whether online social networking will ever replace the much more chaotic and interesting real-world networking on the trade show floor or at the vendor-sponsored cocktail party, where you can raise a glass to your favorite router along with a new-found friend who shares your taste in skewered mini-meats.
In reality, for regular companies (not media conglomerates and hip online startups) it may just be “blogging and other online community-building activities” that catch on for the meantime, if any. I’m the first to admit that I’ve been skeptical of the value of blogs, especially the sort of “here’s where my cat threw up today” blogs that clog so much of cyberspace. But one quality I do find potentially valuable is the egalitarian spirit blogging encourages. Publishing is no longer kept to the domain of the gatekeepers; virtually anyone can go online and post their ideas. Community is built grassroots style, from the ground up, rather than dictated from above. Also, where online media is concerned, the reader can now participate in a conversation — as Paul Gillin pointed out in his blog essay “The new journalism.” Social networking (or “social media,” to use Gillin’s terminology) builds community and opens a dialogue between its participants.
That’s the reason why SearchNetworking.com launched The Network Hub as part of the IT Knowledge Exchange. In the ITKE, IT professionals can gather, share problems and solutions, and create a knowledgebase of useful information. Moreover, ITKE is a community of IT professionals — segmented by the interests you select when you register. We hope this distinguishes the ITKE from other, generically social sites. You can ask very specific questions, directed at other qualified people who can collaboratively build answers to your questions. This should save you time in your network troubleshooting and other job functions — a far cry from wasting your day downloading 8-second “Flock of Seagulls” samples (although we don’t necessarily condemn this practice). You can also read IT blogs, like The Network Hub, and post comments — or even start your own blog.
We believe our readers have plenty of expertise in networking (the routers and switches kind) and we want to help you share that knowledge with us and each other, using the “other” kind of networking.
Day two of setting up the network for the World Cyber Games has seen the network take shape.
After a quick trip to Sears to grab some extra tools — a wrench and a screwdriver set — the 25 switches have been configured and assigned names and IP addresses, which will allow the team from ProCurve Networking by HP to manage them centrally. The more than 700 PCs have been fired up and assigned to their switches, making each gaming station its own miniature network. Ben Van Kerkwyk, the lead engineer, said each switch will provide gamers with 1-gig speeds, and localizing the network to each gaming table cuts down on hops that could degrade performance — which in a gaming environment could be disastrous.
Crews laid down more cable, creating a maze of multi-colored wires streaming to and from all of the ports. Once cabling is complete, the network will be segmented into separate VLANs and subnets, making it easier to manage, Van Kerkwyk said. The ProCurve team will also make some sections of the gaming area wireless for VIPs and admins.
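Splitting a network into per-table VLANs and subnets like this is, at its core, an addressing exercise: carve one block into a subnet per VLAN. A quick sketch with Python’s ipaddress module shows the idea — the 10.0.0.0/16 block, the /24-per-table split, and the VLAN numbering are illustrative assumptions, not the actual World Cyber Games addressing plan:

```python
import ipaddress

# Hypothetical plan: carve one private /16 into a /24 per gaming table,
# one subnet per VLAN, with the first usable address as each gateway.
# The block and prefix sizes are made up for illustration.
event_block = ipaddress.ip_network("10.0.0.0/16")
table_subnets = list(event_block.subnets(new_prefix=24))  # 256 /24s available

for vlan_id, subnet in enumerate(table_subnets[:25], start=1):  # 25 switches
    gateway = next(subnet.hosts())  # first usable host in the subnet
    print(f"VLAN {vlan_id:3}: {subnet} gateway {gateway}")
```

The management payoff is exactly what Van Kerkwyk describes: a problem on one table’s VLAN stays on that table, and broadcast traffic never crosses subnet boundaries.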
There is also an overflow of extras — two more core switches arrived today, and there are extra 2650 switches on hand in case something goes wrong.
“We have three core switches and 15 power supplies, so we’re good if something goes down,” Van Kerkwyk said.
Here’s a draft of what the network will look like upon completion:
And here’s ProCurve technical consultant Chris Ruybal rack-mounting the 8212 core switch:
The World Cyber Games Grand Final is gearing up to be the gaming event of the year, with gamers from all over the globe gathering at Seattle’s Qwest Field Event Center to battle each other in FIFA ’07, Counter-Strike, StarCraft and others…sorry, no Ms. Pac-Man or Donkey Kong for us old-schoolers. Picture the Super Bowl of the gaming world, a contemporary version of the events documented in The Wizard or The King of Kong.
But before the game-fest kicks off in earnest on Oct. 4, teams are working feverishly to set up the massive network to ensure the games go off without a hitch. Imagine the network dropping out on the final lap of Need for Speed Carbon.
In less than 48 hours, a team from ProCurve Networking by HP — with help from several other groups along the way — will assemble an enterprise-grade network powering roughly 1,000 network devices, more than 700 of which are gaming systems. A tall order for a temporary network.
Today was all about setting the stage. The ProCurve team used diagrams to plot the layout. It spent some time tracking down an elusive ProCurve 8212zl core switch (which was in the building, but nowhere to be found … despite the gaming festival being the 8212’s first public appearance). The rest of the day was spent powering up and troubleshooting more than 20 ProCurve Switch 2650s and mounting the found 8212. Elsewhere, teams laid cable, set up PCs and ensured things were good to go. And that was just in the balcony.
But, alas, progress was cut short on Thursday, and the main event floor was off limits, because ’80s rocker Bryan Adams was performing at the venue Friday night. His sound check was set to begin at 4 p.m. Thursday, meaning all setup was suspended until Friday morning.
“That’s one of the challenges for setting up for an event like this,” said ProCurve Technical Consultant for the Americas Chris Ruybal. “The on again and off again.”
I guess Adams was right when he crooned, “It cuts like a knife.”