Enterprise IT Watch Blog


April 15, 2010  6:20 AM

See All, Know All INSIDE Your Network: Security Situational Awareness

Profile: Guest Author

Today’s guest post is from Pete Schlampp, vice president of marketing and product management at Solera Networks.

The Identity Theft Resource Center (ITRC), the organization that tracks data breaches, reports 211 data breaches so far in 2010, 26 of them involving financial services companies. According to ITRC, many incidents actually occurred in 2009 but are only now being brought to light. Waiting weeks, months, or longer to discover network breaches hardly seems acceptable. Even worse, the majority of these breaches involve an unknown number of records exposed. Why? Because there is no way to “replay the tape” and see exactly what was stolen or touched. Existing tools record only metadata and signature matches. Without good situational awareness, we’re dealing with the equivalent of digital hearsay.

Demand for better situational awareness (knowing and seeing what’s happening inside the network) has led to new technologies and the commercialization of tools that increase the resolution of what security engineers can see and know. Technically, the ability to record network traffic and carve it into perceptible chunks has been around for years; ask a network troubleshooter about tcpdump and wireshark and they’ll gush like a carpenter over a favorite hammer. Network Forensics companies have taken these technologies and created more robust, accessible, and maintainable tools. At the same time, the cost to store and process one hour of GigE network traffic has dropped from tens of thousands of dollars to hundreds over the past five years. The Network Forensics space is rapidly evolving and highly differentiated: performance, scalability, and the analytical applications available can vary widely.
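To give a flavor of what those command-line tools do at their simplest, here is a minimal capture sketch in Python using the third-party scapy library (my choice of illustration, not anything from Solera; tcpdump -w capture.pcap is the classic command-line equivalent):

    from scapy.all import sniff, wrpcap  # third-party: pip install scapy

    # Record 100 packets from the default interface and write them to disk.
    # This is the same capture-now, analyze-later model that forensics
    # appliances industrialize at far larger scale. (Sniffing needs root.)
    packets = sniff(count=100)
    wrpcap("capture.pcap", packets)
    print(f"Recorded {len(packets)} packets to capture.pcap")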

A recent survey indicates that many network security professionals don’t yet understand the need for Network Forensics or how it can provide situational awareness. Relying on security tools that match signatures of known threats, or that collect metadata spewed off of “dumb” network devices, security engineers aren’t equipped to know even simple details: who is on the network, what applications are being used, and what content is being transferred. This lack of perception forces enterprises and government organizations into reacting to security threats instead of proactively policing their networks and stopping threats before damage can occur. Improved situational awareness can lead to better security and higher resiliency against the backdrop of increasingly advanced and persistent threats. Security engineers with that awareness know and see exactly what’s happening on the network and can control it.

Wikipedia defines situational awareness, or SA, as “the perception of environmental elements within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.” In the world of physical security, we think of SA as seeing, hearing, and otherwise sensing the world around us. A major advancement in SA came with the advent of CCTV and the ability to remotely “see” what was happening, both live and in the past. On the network, we aren’t responding to incidents in real time because we can neither see them as they happen nor go back in time and replay events to better understand the current situation.

Without situational awareness, IT security teams respond to incidents the way a fire department responds to fires: a bystander calls in to report the problem. By far the most typical way for a network incident to be discovered is a third party or employee notifying IT that something strange has happened: intellectual property has been found outside the network, a server is running slowly, or a bad actor is bragging about their success. The 2009 Verizon Business Data Breach Investigations Report finds that over 80% of network breaches are discovered either by third parties or by employees going about their regular work activities – not by our existing automated security devices. Because of this, incidents are discovered late, lack data and detail, and lead to higher costs for organizations, industries and individuals.

Network security teams that make the shift to improved situational awareness gain the insight that true security requires. They can stop reacting to security issues and start seeing problems, and knowing what to do about them, before an incident becomes a full-blown breach. New tools, particularly network forensics appliances from companies like Solera Networks and our competitors, can reduce the occurrence of network breaches, augment understanding of network alerts and incidents, and let security teams recognize exactly what data may have been compromised, so they can proceed consciously and confidently to provide better security.

April 14, 2010  2:19 PM

Building the ultimate network security and troubleshooting utility belt

Profile: Michael Morisy

After writing about the importance of network forensics in securing your corporate front lines, I thought it might be helpful to pull together some of the top tools for actually helping protect and maintain your network. Have a suggestion to add to our list? E-mail me at Michael@ITKnowledgeExchange.com or update our community Wiki. Continued »


April 14, 2010  10:20 AM

Network forensics: Putting the CSI in Cisco

Profile: Michael Morisy

Networks are the corporate crime scenes of today. Just ask Google, TJX, or any one of the thousands of companies that have seen their networks turned against them. IT professionals need to step up their game when it comes to dusting for digital prints.

Fortunately, they’ve got a set of tools that (almost) makes CSI look amateur, and some of the best tools have fallen into the domain of networking professionals, according to Gartner’s John Pescatore:

“We have a broader array of tools called data forensics, and one half of that is network forensics and the other half is computer forensics, which you can put on every PC and server. The network products have the major advantage that it’s very expensive to put software on everybody’s PC and server, and people … can very often disable that software,” he told the IT Watch Blog recently in an interview. “The network tools are more widely used because of those advantages.”

Rather than watching every bit on every computer, network tools watch the choke points: They can see what users are downloading and uploading, e-mailing and IM’ing, and even record all that data for later playback, like a closed circuit television camera or omniscient network DVR.
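To make that “network DVR” analogy concrete, here is a toy replay sketch in Python, again with the third-party scapy library (the capture file name is hypothetical), that loads a saved recording and summarizes who talked to whom:

    from scapy.all import rdpcap, IP  # third-party: pip install scapy

    # "Replay the tape": load a recorded capture and tally conversations.
    packets = rdpcap("capture.pcap")
    conversations = {}
    for pkt in packets:
        if IP in pkt:
            pair = (pkt[IP].src, pkt[IP].dst)
            conversations[pair] = conversations.get(pair, 0) + 1

    # Print the chattiest source/destination pairs first.
    for (src, dst), count in sorted(conversations.items(), key=lambda kv: -kv[1]):
        print(f"{src} -> {dst}: {count} packets")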

But just like CSI, today most security lapses aren’t discovered until somebody turns up dead or, in corporate terms, until customers start complaining and stuff starts breaking.
Continued »


April 9, 2010  7:00 AM

The network evolution: Virtualization and the cloud force new strategies

Profile: Guest Author

Today’s guest post comes from Rivka Little, site editor of SearchNetworking.com and my former colleague from my days in TechTarget’s networking group. I asked her if she’d be willing to write a guest post for this month’s look at all things networking, and she agreed, taking on the challenging topic of how networks are going to matter as we enter the age of the cloud, virtualization and other technologies that promise to push IT out of the office. You can read more of Rivka’s reporting and analysis at The Network Hub blog.

The network has been forgotten. At least that seemed to be the case over the last couple of years amid the hubbub surrounding server virtualization and cloud computing.

But stark realities have brought the network back into focus. Server virtualization and cloud computing aim to dynamically deliver applications and data — provisioning and de-provisioning resources on demand. There is no doing that without a new kind of network.

Networking teams are no longer solely responsible for architecting, implementing, securing and managing LANs and WANs. Now they find themselves implementing unified data center fabrics that converge storage and data center networks so that applications can flow freely from their resting state through to the WAN and LAN.

Networking teams also find themselves responsible for routing and switching not only between physical machines, but deep within the server. They are managing traffic both within the server between virtual machines and among physical servers in multiple data centers.

This will eventually lead to the creation of virtualized network components that sit atop physical switches and routers. Among SearchNetworking readers surveyed in 2009, 40% said managing virtualization would be a top priority for the networking team in 2010.

Networking pros will also use these virtualization management skills in building out cloud computing networks. Network architects find themselves building both private clouds and hybrid clouds that interconnect private data center resources with those in public facilities.

Among SearchNetworking members, 35% say their companies are considering building an internal cloud in 2010, while another 30% say their networking resources will be affected by supporting external cloud services.

The shift to the cloud model will require users to push intelligence away from the data center core and into the layers of the network. Enterprises will seek intelligent edge switches with baked-in access control, security, visibility and management. Routers and switches will act as servers with built-in application-specific firewalls and bandwidth management. This type of manageability will mean the ability to burst bandwidth up and shrink it down according to application demand.
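To make “burst up and shrink down” concrete: the classic building block for that kind of bandwidth management is a token bucket, sketched minimally below in Python (the rate and burst numbers are illustrative assumptions, not anything from this post):

    import time

    class TokenBucket:
        """Token-bucket limiter: a sustained rate plus room for bursts."""

        def __init__(self, rate_bps, burst_bytes):
            self.rate = rate_bps          # sustained rate, bytes/second
            self.capacity = burst_bytes   # maximum burst size, bytes
            self.tokens = burst_bytes     # start with a full bucket
            self.last = time.monotonic()

        def allow(self, nbytes):
            # Refill for the elapsed time, capped at the burst capacity.
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if nbytes <= self.tokens:
                self.tokens -= nbytes
                return True
            return False

    # An application may burst up to 1 MB at once, then is held to ~128 KB/s.
    bucket = TokenBucket(rate_bps=128 * 1024, burst_bytes=1024 * 1024)
    print(bucket.allow(512 * 1024))  # True: within the burst allowance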

Finally, all this shifting in technology comes along with a serious change in culture for networking teams. More than ever before, IT organization silos are fading and networking, systems and storage teams are pressed to work together to enable unified fabrics, virtualization and cloud computing networks. As this transition occurs, networking pros will have to make their voices heard and claim their central role. That shouldn’t be too difficult as networking technology has already surfaced as the lifeline of these emerging technologies.


April 7, 2010  9:47 PM

The top technical books for networking professionals

Profile: Michael Morisy


Looking to boost your networking career, or simply bone up on the latest trends and topics in your field? You’ve come to the right place: I’ve polled analysts, authors, IT pros of all stripes and, of course, our very own member community.

Top reads so far (click the title for more information):

Continued »


April 1, 2010  7:00 AM

Your guide to the networking blogosphere

Profile: Michael Morisy

If the Internet really is a series of tubes, it’s network engineers who keep those tubes running. But how, exactly, do you keep it running today while keeping an eye on what you’ll need tomorrow? Get the experts’ opinions from our picks of the top networking blogs. Know of a great networking blog we’ve missed? Sound off in the IT Knowledge Exchange forums, where other IT professionals are chiming in with their thoughts.

Editorial Blogs

Independent/Analyst Blogs

User Blogs

Vendor(ish) Blogs


March 30, 2010  6:00 AM

SQL attacks come from the darndest places

Profile: Michael Morisy

SQL injection attacks are a constant thorn in the side of security practitioners, claiming the dubious distinction of being the attack vector for the largest U.S. ID theft case ever. And while tools are arriving on the scene to help businesses root out potential problems before the bad guys do, there are plenty of attack vectors just waiting to be exploited. The latest case? An image floating around the web showing a, er, creative license plate cover designed to foil traffic cameras.

Will it work? Unlikely (see commentary on Gizmodo), but it’s a good reminder that attacks can come from the darnedest places. It’s also a nice throwback to the classic SQL injection comic from XKCD.
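For readers who haven’t seen the comic, its payload works because user input gets pasted directly into a SQL statement. A minimal Python sketch of the vulnerable pattern and the parameterized fix (the students table mirrors the xkcd joke; every name here is hypothetical):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE students (name TEXT)")

    name = "Robert'); DROP TABLE students;--"  # xkcd's "Bobby Tables" payload

    # Vulnerable: splicing input into the SQL string lets it run AS SQL.
    # conn.executescript("INSERT INTO students (name) VALUES ('%s')" % name)

    # Safe: a parameterized query passes the value as data, never as SQL.
    conn.execute("INSERT INTO students (name) VALUES (?)", (name,))
    print(conn.execute("SELECT name FROM students").fetchall())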

As if “smoker doors”, weaponized e-mail and your own PC weren’t enough to keep you worried.


March 29, 2010  11:32 PM

Green IT: Myth or Reality? Help tell the true story

Profile: Guest Author

This is a guest post and request for information by Johanne Murray, a Canadian research student at National Cheng Kung University in the Business Management Department.

The concept of green information technology has been around since 1992; however, like other green products, it has not experienced a tremendous growth rate. Green products in general have not followed a traditional product adoption model.

Stakeholders have now begun to put more pressure on companies to adopt greener technology systems, and many companies claim they are either in the process of doing so or have already done so. However, in my research so far I have not been able to get past the managers who are making these claims. This makes it difficult to understand the adoption process and the perceptions of personnel.

Although it is of great interest to speak with the project managers, directors, and CEOs who are mandating Green IT within their companies, it is difficult to base academic research on these claims alone. So far there has been little academic research based on the people, the personnel and management, who are actually working with these newer and greener IT systems.

Are companies really going green, or is it something that is just stated to appease stakeholders? Where are the personnel who are adopting these new systems? Are you supposedly using green IT in your workplace? Does it make a difference? Does it make your work easier? Was it easy to adapt to? Do you feel there were sufficient resources and education to adopt this technology?

This research is attempting to answer these questions; however, it has been a challenge finding people who are actually using recently adopted Green IT.

Is green IT just a myth? Is it a case of corporate greenwashing, or are these companies really making the technology transfer?

This academic research is dedicated to the advancement of Green IT.

If your company has mandated and adopted Green IT, and you are using green computers or other information technology that is more environmentally friendly than its predecessor, please take a minute to fill out this survey:

http://www.surveymonkey.com/s/JBG9C2N

Or, if you are working for a company that claims it is adopting Green IT and you are not so sure, or have issues with their claims, please contact: johannemurray@hotmail.com

All respondents’ details will be kept confidential.

If you are a manager or CEO who is truly proud of your Green IT technology transfer and would like to make your company an example for others to follow, please contact johannemurray@hotmail.com to become part of an exciting case study. This would include telephone interviews with a variety of personnel affected by the transfer. This is cutting-edge research and would be a great opportunity for companies to tell their Green IT story to the world.


March 29, 2010  7:00 AM

Weighing the Real Cost of Mobile Broadband

Profile: Guest Author

Should you take the mobile plunge if you haven’t already? While many companies’ workforces are wired with the latest gadgets, IT departments have occasionally been hesitant to jump on board for a number of reasons. Today’s guest post – by Tim Scannell, editorial director of sister site TechnologyGuide.com – outlines why 4G might mean it’s time to re-think corporate wireless strategy.

One strong theme at this year’s CTIA conference, which wrapped up last week, was the evolution of mobile broadband. Loosely defined, this refers to everything and anything traditional broadband offers, but accessible through a mobile device – in the case of the CTIA cognoscenti, this specifically relates to small, handheld systems.

Up until very recently, this has pretty much been a blue-sky concept since there were only a handful of devices that were really capable of providing a rich browsing experience.  Also, the browser software still had a way to go in terms of development, and cellular infrastructures just weren’t up to snuff when it came to fast and reliable service.

All of that is changing rapidly, however. At CTIA, there were a number of interesting and powerful devices capable of operating across emerging 4G wireless networks – like the HTC EVO 4G, which will reportedly be the first smartphone available in the U.S. with built-in WiMAX (which, in many cases, provides much more reliable wireless access than cellular, particularly in congested urban areas). The new HTC system also runs Google’s Android OS and has a very large high-resolution display.

Newer classes of mobile computers – like netbooks – are also catching on in the small business and small enterprise markets, especially as the number of mobile workers increases and efforts continue to extend customer relationship management and internal information resources out to the point of customer contact. The share of online consumers who own a netbook has increased from 10 percent last year to 15 percent this year, with most people using a netbook as a second device rather than a replacement for a notebook computer, according to a recent survey.

Tablet PCs are also finally finding their niche in mobile business computing, spurred by interest in the soon-to-be-shipped Apple iPad. Fifty-seven million “media tablet PCs” are expected to ship in 2015, according to analysts at ABI Research – roughly fourteen times the 4 million expected to ship this year.

As prices for mobile systems plummet and the wireless infrastructure becomes more reliable and varied, with converged connectivity options (cellular, WiFi, WiMAX, etc.), it makes sense for companies of all sizes to have a mobile solutions strategy. Yes, there are some significant challenges, like mobile management, service and support, security, and developing a collaborative strategy. But the benefits can be huge in terms of getting closer to customers and speeding transactions.

Since every company is different, it is difficult to come up with a “one size fits all” return on investment (ROI) formula that can quickly validate initial purchases, training, support and other functions. Focusing too much on the cost of implementation and operations can also be a mistake – especially in a down economy where the mandate is slashing expenses rather than adding to expenditures.

To get a more realistic and long-term picture (as well as to convince upper management a mobile strategy is working), an increasing number of companies are instead measuring the efficiencies created by a mobile strategy. At a major magazine distribution company, for example, the goal is to use mobile solutions to increase the efficiency of every worker by about 5% – saving about 24 minutes of wasted time per day. When you translate that time into dollars and extend it across hundreds or thousands of mobile workers, the savings can run into the millions, notes the company’s IT director.
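The arithmetic is worth spelling out; here is a back-of-envelope Python sketch in which only the 5%-equals-24-minutes figure comes from the example above, while the workday length, labor cost, headcount, and workdays per year are my illustrative assumptions:

    # Back-of-envelope mobile ROI, per the 5% efficiency example above.
    workday_minutes = 8 * 60                # assume an 8-hour (480-minute) day
    saved_minutes = workday_minutes * 0.05  # 5% gain = 24 minutes/day/worker
    hourly_cost = 40.0                      # assumed fully loaded $/hour
    workers, workdays = 1000, 250           # assumed headcount, workdays/year

    annual = (saved_minutes / 60) * hourly_cost * workers * workdays
    print(f"{saved_minutes:.0f} min/day/worker -> ${annual:,.0f}/year")
    # 24 min/day/worker -> $4,000,000/year: "in the millions," as claimed.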

The real question to consider then is not how much implementing mobile systems and services will cost, but what the expense will be if you do not take the plunge and make mobile broadband an integral part of your business strategy.

If you’re a fan of Tim’s writing, be sure to check out his soon-to-be-launched blog, Technology Guide Lines, hosted right here on IT Knowledge Exchange.


March 22, 2010  2:28 PM

Google skipfish, 0-day hunter

Profile: Michael Morisy

If web apps are really going to take off in the way Google hopes, the Big G knows it needs to tighten up the security holes on web apps at large, no matter how elegant their own solutions are.

Enter skipfish, Google’s automated web security scanner, which was launched Friday by Michał Zalewski in a post on the Google Online Security Blog:

Today, we are happy to announce the availability of skipfish – our free, open source, fully automated, active web application security reconnaissance tool. We think this project is interesting for a few reasons:

  • High speed: written in pure C, with highly optimized HTTP handling and a minimal CPU footprint, the tool easily achieves 2000 requests per second with responsive targets.
  • Ease of use: the tool features heuristics to support a variety of quirky web frameworks and mixed-technology sites, with automatic learning capabilities, on-the-fly wordlist creation, and form autocompletion.
  • Cutting-edge security logic: we incorporated high quality, low false positive, differential security checks capable of spotting a range of subtle flaws, including blind injection vectors.

For those worried that this just further enables malicious script kiddies to hunt out and play with gaping holes in your poorly designed web app (or that budget SaaS vendor your CIO chose), Google included this disclaimer:

First and foremost, please do not be evil. Use skipfish only against services you own, or have a permission to test.

We’ll see how long that lasts, but at least there’s another (open source, no less!) tool from a reputable company to help catch problems before someone else does. If you’re interested in a second opinion, the folks at Sucuri Security also took a closer look at skipfish and left with a favorable impression.
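If you want to kick the tires yourself, here is a minimal sketch of scripting a scan from Python. The -o output-directory flag and the positional target URL follow skipfish’s documented usage, but the paths and target here are hypothetical, and some builds also require a wordlist via -W, so check the README that ships with your version:

    import subprocess

    # Scan only a target you own or have permission to test. skipfish
    # writes an HTML report into the directory given with -o (it must
    # not already exist).
    subprocess.run(
        ["./skipfish", "-o", "skipfish-report", "http://staging.example.internal/"],
        check=True,
    )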

