Think you’ve made a safe storage decision by going with a trusted name? Think again. Today’s guest post is from Roger Kelley, aka @storage_wonk, who is the principal architect at Xiotech, blogger at StorageWonk.com and one of our featured storage Tweeters.
Of all the challenges faced by Information Technology (IT), purchasing SAN storage for the data center is one of the biggest. Cost, criticality, and complexity are the three C’s that all too often impact the fourth C: career. IT is quite used to making tactical purchases in the form of servers, routers, desktops, and the like. SAN storage acquisition is harder because it is inherently strategic in nature and therefore poses greater long-term risks and rewards for the company at large. It’s interesting that most companies tend to follow a similar path when the order comes down to evaluate storage for purchase. They form a committee that draws up a list of weighted requirements for both hardware and software features and functions. They then invite several storage vendors to the party to offer their pitch and, from that pool of information, score each vendor against the requirements list in an unbiased, “left brain” sort of way.
The perplexing part comes in when, after all this effort is expended and the players are evaluated, IT decides to go with a large vendor purely because “Nobody was ever fired for buying _____!” Though they may think they’re playing it safe or smart, what it really indicates is how little confidence they have in their own ability to select a storage vendor based upon the technological merits of the vendor and the unique business drivers of their own organization. People who make storage decisions based upon fear are, in effect, saying, “When it fails I can always justify my acquisition to senior management by hiding behind the perceived reputation of the market leading vendor because that’s what everyone else buys.” This lemming-like behavior can prevent a company from benefiting from newer technologies that might significantly help their bottom line. Given the quality and innovation of smaller companies, this type of thinking is an unnecessary hindrance.
Truth is, there are excellent reasons to look beyond the so-called market leaders in storage and see what the smaller companies are up to. These smaller companies can’t outspend the big ones on marketing so they have to focus on out-innovating them in technology. This innovation has recently led to some amazing technological breakthroughs that the big boys simply can’t offer, such as self-healing arrays (not “fail in place” but truly “self-repairing”), unheard of real-world performance gains without expensive SSD (no, really!), hugely scalable architectures that actually scale, and unmatched levels of reliability.
The problem with the big guys of storage is that they get lax and stifle innovation in favor of playing it safe and keeping the revenue stream rolling along. The smaller players simply cannot afford to be lax; they have to innovate to stay alive, and this bodes well for customers who are looking for “best of breed” options for their company. But to take advantage of these innovations, company management has to empower their IT staff to make the right technology decisions for the company instead of putting them in a circumstance where they feel they have to make the safe decision for their own careers.
But some may ask, “Isn’t avoiding difficulties precisely why you should buy from the big boys?” Not really! First of all, hardware, from all vendors, can—and does—experience difficulties. No hardware manufacturer can provide 100% perfect operation, 100% of the time, for 100% of its customers. Glitches arise, hardware fails, configurations get mangled due to human error (both vendor and end-user). In short, life happens for everyone. Second, smaller vendors tend to be much more responsive and nimble when difficulties inevitably arise simply because your business means more to their bottom line than it does to the industry giants. If you need a resolution to a problem with a giant company, it can often take 6 months to get a “no”. “Yes” can take considerably longer.
In the end, whether or not your chosen storage vendor is successful in your environment is more about architectural design than brand name. True, a properly architected storage network tends to be more expensive—but then so is downtime! Buying storage based upon technological and business drivers, and spending a bit more to do the job right, are significantly better strategies for company and career than doing things half-way and hiding behind a name when things fail.
Here at ITKnowledgeExchange.com, one of the evergreen questions we see in the IT answers forum is the recent IT grad asking for advice on what to do next. With so many possible certifications and paths available, figuring out your course of action is a daunting task. Never fear: At ITKnowledgeExchange we have members with decades of experience to provide their two cents to those in need of IT career guidance. One of our newer members, ITstudentGrad, asked for suggestions of good entry level IT jobs. He’s received some great advice from members who were once in his shoes.
Tip 1: Exhaust your resources.
As our very own EmNichs points out, bloggers like Ed Tittel offer advice such as 7 Questions for Highly Effective Career or Certification Advice. There are myriad resources for jobseekers in all industries, and IT is no exception. Whether you’re searching for general career advice or a more complete picture of specific areas in IT such as business service management, cloud computing, or IT consulting, more and more IT professionals are jumping on the blogwagon to share their insights, predictions and experiences across all areas of IT.
Tip 2: Start at the beginning.
Shanekearney, a network engineer in Ireland, has a diploma in Computer Science, CCNA (Cisco Certified Network Associate), MCSA (Microsoft Certified Systems Administrator), MCP (Microsoft Certified Professional), A+, ECDL (European Computer Driving License) and is currently studying for MCSE (Microsoft Certified Systems Engineer).
His advice for ITstudentGrad is to get a taste of every flavor of IT: “You should start in service desk tech support as it deals with so many different areas and allows you to gain much needed real world work experience.” Despite Shane’s complaints about his position, he assures that it is a good jumping off point to get to a more technically demanding position. In a hands-on industry such as IT, nothing teaches better than real-world experience.
Tip 3: Change your mind.
Stevesz began his career in IT in the late 1970s, advancing from part-time to full-time for over two decades. His company, LAN Doctor, Inc. lends him experience from “break/fix” tasks to computing strategy and network installation and maintenance. His advice is to go after what interests you, but also what will sharpen your skills and knowledge. Get experience troubleshooting hardware and software problems at a help desk position. Don’t approach anything without a dose of flexibility; change your path as your interests change.
Stevesz shares his own story of changing interests: “I started out as a dBASE III programmer (if anyone still remembers dBASE at all) then actually went to a help desk type of situation for a company that had its own proprietary database program. From there I went to a company that basically did break/fix work. We also branched out into servers and networking, and other things surrounding computers, concentrating on small, very small, businesses, and I am still at that company, as a partner in the business.”
Always keep feelers out there and if something strikes your interest, send a query! IT is an industry constantly in motion, requiring you to be just as fluid in your pursuits.
Tip 4: Give a little, get a little.
Labnuke99 is an IT manager at a company that designs, manufactures and sells electronic components and assemblies primarily to original equipment manufacturers (OEMs). He provides support for international and domestic data networks and security. Labnuke99 advises seeking out a symbiotic relationship with a non-profit organization; what the work lacks in remuneration will be made up for in other ways. Learn how organizations require and use IT services on a tight budget.
Aside from gaining invaluable real-world experience and making contacts as an IT professional, you can get satisfaction from giving charity and advocacy organizations the help they need. Treat any volunteer prospect like a potential job: Send along a polite inquiry and resume along with your motivations for helping out. Be sure to detail your availability; promising hours or skills you can’t deliver will only hurt you in the end. Do your research beforehand and outline specific areas where you’d like to help them improve: fix their confusing website, for example, or ask them what IT headaches you can help with. Build your resume, connections, skills and karma points all in one fell swoop.
A couple weeks ago, I read an article over at Tech News World that got me thinking about endpoint security, and how it has become like a spy movie, where the biggest threats are often coming from the inside. We asked you if your endpoint security focus was shifting, and how you’re managing that. And you answered…
Technochic’s company has disabled USB ports, CD writers and implemented strict mobile policies when connecting to email servers. But it’s worth it, she says, because with a little awareness and a plan of action, endpoint security can protect against both internal and external threats.
Jinteik’s company, in addition to disabling USB and CD/DVD drives, has locked down the BIOS. They don’t allow OWA and have strict printing-permissions policies. Vendors can’t connect to their networks, nor are there any wireless devices in their office.
TomLiotta’s company has taken into account the need for USB ports to connect devices such as a mouse or a keyboard. Aside from antivirus, they audit regularly via automated monitoring, authenticate and authorize according to position, and maintain a security policy. They’re in a unique spot as a software vendor of network security, meaning their employees are more knowledgeable than most on how to cause trouble. Their solution? Focus more on quality employee relationships and education rather than software and hardware obstacles. He makes a good point:
Fundamental safeguards will always be in place. This protects from mistakes made by the best of us. But clear authentication combined with authorizations that are capability- and object-based, for employees who have a solid relationship with their employer and who always have access to a good security policy, into systems with strong monitoring, all tend to make most issues disappear.
Chippy088, or David, agrees with Tom, and highlights the way many obstacles can be circumvented. Active Directory controls are in place primarily for normal users, and disabled physical ports can sometimes be accessed in safe mode. His suggestion? Virtual machines:
[They are] 90% more effective in controlling users trying to bypass security controls, as the local physical devices are not used in saving/printing.
What does bother him, though, is mobile devices: finding the balance between control and flexibility for off-site access points.
Mitrum got right to the point: “I disabled USB ports and CD/DVD-ROM, Bluetooth, micro SD, MMC, etc. in my organization.”
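Disable-everything policies like these only hold up if they are checked regularly, which is where the automated auditing TomLiotta mentions comes in. A minimal Python sketch of such a check (the policy set and endpoint reports here are hypothetical, not drawn from any real product):

```python
# Hypothetical policy: interface types that must be disabled on every endpoint.
REQUIRED_DISABLED = {"usb_storage", "cd_dvd", "bluetooth", "sd_card"}

def audit_endpoint(name, disabled):
    """Return a sorted list of policy violations for one endpoint,
    given the set of interfaces it reports as disabled."""
    missing = REQUIRED_DISABLED - set(disabled)
    return sorted(f"{name}: {iface} still enabled" for iface in missing)

# One compliant machine and one that missed Bluetooth and the SD slot:
print(audit_endpoint("wkstn-01", ["usb_storage", "cd_dvd", "bluetooth", "sd_card"]))  # []
print(audit_endpoint("wkstn-02", ["usb_storage", "cd_dvd"]))
```

In a real deployment the per-endpoint reports would come from your management tooling; the point is that the comparison against policy is trivial to automate.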
Do some of these strike you as too strict or not strict enough? Share your thoughts and your own endpoint security policies in the comments section!
It’s the interior design of the IT enterprise, but someone has to do it. Rethinking and – ultimately – redesigning your network to wireless should happen in several phases, the first of which is planning. Like my father says, planning prevents piss poor performance. The 5 Ps, if you will. This month we’re focusing on wireless networking and the steps necessary to untangle you from the wires of yesteryear.
Why WiFi (WhyFi?)
In general, creating a wireless network allows for greater flexibility, from in-office mobility to network configuration. Wires constrain where equipment can go and hinder the reorganization and growth of your network, so the costs saved through increased productivity and easier reconfiguration are an important consideration. Lisa Phifer of Search Networking reports that “even ‘average’ companies [that invested in 802.11n] reported 114% growth in WLAN traffic, a 60% increase in wireless coverage throughout their offices, and a 44% reduction in downtime.” Can you afford anything less?
A walk through the WiFi technologies
Today’s industry standard, 802.11n, has improved upon the bandwidth of previous standards by using multiple wireless signals and antennas, or MIMO (multiple input, multiple output) technology. It handles real-world data rates of 100 Mbps and up (the standard allows a theoretical maximum of 600 Mbps) with better range and backward compatibility with 802.11b/g. There was definitely a build-up to 802.11n, which features the fastest maximum speed, best range and most resistance to interference.
The follow-up to the pioneering 802.11, 802.11b offered a respectable 11 Mbps, similar to traditional Ethernet. Because it operates in the unlicensed 2.4 GHz band, there was room for interference from microwaves and cordless phones sharing that frequency. The later 802.11a increased bandwidth to 54 Mbps. Its move to the less crowded band around 5 GHz solved the interference problem but introduced a shorter range more easily obstructed by walls.
Enter 802.11g, birthed from the attempt to combine the pros of 802.11a/b, with a bandwidth of up to 54 Mbps. Backwards compatible with 802.11b, it operates on the 2.4 GHz frequency, making interference a consideration. The higher cost was worth it for the combination of a fast maximum speed and a wide range.
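The walkthrough above condenses into a small quick-reference. Here’s a Python sketch (the rates are the nominal figures cited in the text; real-world 802.11n numbers vary with configuration):

```python
# Quick reference distilled from the standards walkthrough above.
STANDARDS = {
    "802.11b": {"max_mbps": 11,  "bands_ghz": (2.4,)},
    "802.11a": {"max_mbps": 54,  "bands_ghz": (5.0,)},
    "802.11g": {"max_mbps": 54,  "bands_ghz": (2.4,)},
    "802.11n": {"max_mbps": 100, "bands_ghz": (2.4, 5.0)},  # nominal; more in practice
}

def can_dodge_2_4ghz(standard):
    """True if the standard can run in the 5 GHz band, away from
    microwave ovens and cordless phones."""
    return 5.0 in STANDARDS[standard]["bands_ghz"]

print([s for s in STANDARDS if can_dodge_2_4ghz(s)])  # ['802.11a', '802.11n']
```

The takeaway matches the prose: only 802.11a and dual-band 802.11n can escape the crowded 2.4 GHz band entirely.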
Avoiding Piss Poor Performance
Now it’s time to remember the 5 Ps and begin planning. It’s important to understand that an overhaul such as the wired-to-wireless transition can be too much for an all-at-once solution. Knowing what level of upgrade your company is capable of is the first step. Search Networking’s Lisa Phifer suggests that “upgrades must be budgeted and scheduled over time, resulting in an incremental network infrastructure migration.”
Consider the hardware you’ll have to replace such as Fast Ethernet switches (with Gigabit Ethernet switches). Because 802.11n uses more electricity than previous access points, you’ll need to consider an upgrade in switch ports to provide more power. Get in on the 2.4 GHz versus 5 GHz debate; do you need the wider range provided by 2.4 GHz or will you deploy dual band wireless LAN APs? If you decide to deploy both frequencies in your network, consider building the capability to guide clients to connect in 5 GHz when possible. Search Networking’s Shamus McGillicuddy suggests Cisco’s BandSelect or Aruba’s “band steering feature in its Adaptive Radio Management software.”
Sit down and do this:
- Try to predict the traffic load that wireless will deliver to your wired network and how it will grow over time.
- Plan for refresh cycles as clients and your workforce migrate to 802.11n devices.
- Design your WLAN layout and configuration to solve bottlenecks before they occur.
- 802.11n’s MIMO radios draw more power than older access points; factor Power over Ethernet upgrades into your plans.
- Be sure that legacy tools such as WLAN analyzers are upgraded and prepared to work with 802.11n.
- A larger network with increased capabilities such as the 802.11n also requires better security with increased monitoring. In order for your wireless network to meet its full potential, it must be fully secured and monitored from the get-go with 24/7 alerts of unwanted connections to the network.
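The power item in the checklist above is easy to sanity-check with arithmetic. The usable-power figures below come from the IEEE 802.3af and 802.3at (PoE+) standards; the access-point draw is an illustrative assumption:

```python
# Usable watts delivered to the powered device under each PoE standard.
POE_BUDGET_W = {"802.3af": 12.95, "802.3at": 25.5}

def port_can_power(ap_draw_w, poe_standard):
    """Can a switch port of this PoE class power the access point?"""
    return ap_draw_w <= POE_BUDGET_W[poe_standard]

# A hypothetical dual-band 802.11n AP drawing about 15 W:
print(port_can_power(15.0, "802.3af"))  # False -- plan a switch upgrade
print(port_can_power(15.0, "802.3at"))  # True
```

Check your actual AP spec sheets, of course; the sketch just shows why an 802.11n rollout often drags a switch upgrade along with it.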
In IT there is definitely a time and a place for the trial and error method, but when you begin migrating your wired network to wireless, avoid trial and error for maximum effectiveness and minimum headaches.
Wondering when you should update your wireless network from scattered unsecured “hotspots” to something a little more … serious? Then you’ve been wondering a little too long, according to today’s guest author Craig Mathias. Mathias is a Principal with the wireless and mobile advisory firm Farpoint Group. He can be reached at firstname.lastname@example.org. -Michael Morisy
With the IEEE 802.11n standard taking so long to develop (about seven years!), it’s no wonder that many are still having a hard time with the fact that, yes, the standard is done, and all major WLAN system vendors are now shipping .11n products. In fact, many are reporting that 802.11n is now the bulk of their sales. Farpoint Group has, since the release of the interim 802.11n spec from the Wi-Fi Alliance, in fact been recommending nothing but 802.11n – it’s got so much higher performance in every dimension than any previous standard, from throughput to capacity to range, that there’s no point in riding the older horses anymore. OK, .11n is a little more expensive, but it has vastly improved price/performance – perhaps the vendors don’t want you to hear this, but it’s really downright cheap to install a .11n infrastructure, given falling prices due to competition and much lower component costs.
So it might surprise you to learn that the key issues (and, yes, there are some) surrounding 802.11n have nothing to do with the maturity of the technology or the price to get into the game, or even the radios required. The arguments have moved – here’s what’s at the top of the list for many IT managers today:
- Operating Expense (OpEx) – Whereas the capital expense (CapEx) required to buy that new .11n system is nothing to sneeze at, most of the life-cycle cost in any large-scale WLAN deployment is in OpEx – management, maintenance, help desk, etc. And, of course, the reason for this is that so much of the work here is labor-intensive, and that labor is highly skilled. This is why we always recommend that end-user organizations carefully examine the management capabilities of systems under consideration. There are often major differences in functionality here between different products, and it’s important to verify functionality against one’s list of requirements, including interface points with other operational systems, in order to minimize OpEx and thus total cost of ownership (TCO) down the road. As it is with end-users, productivity is the name of the game in operations as well.
- Assurance – Finally (or not; we’ll get to another cautionary note shortly), it’s important to consider a wireless LAN assurance solution in planning your new .11n deployment. These systems perform a broad range of functions – regulatory compliance monitoring, intrusion detection and prevention, security audits, and, in some cases, even such vital services as spectral analysis to detect interference. There is a slow trend afoot to integrate assurance services into core WLAN management systems, and that makes sense. But you may want a system entirely distinct from your WLAN infrastructure, in a belt-and-suspenders fashion.
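The OpEx point is worth quantifying: because operating expense recurs every year, it dominates lifetime cost even when the sticker price grabs the attention. A toy Python comparison with entirely hypothetical figures:

```python
def tco(capex, annual_opex, years):
    """Total cost of ownership: one-time CapEx plus recurring OpEx."""
    return capex + annual_opex * years

# Two hypothetical WLAN systems over a five-year life: B costs more
# up front, but its better management tools cut yearly operating cost.
system_a = tco(capex=100_000, annual_opex=60_000, years=5)
system_b = tco(capex=150_000, annual_opex=40_000, years=5)
print(system_a, system_b)  # 400000 350000 -- the pricier system wins on TCO
```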
Of course, there’s more. I’m particularly interested in convergence functionality, for example, to allow handoffs between cellular systems and wireless LANs. And while I noted above that differences in radio performance alone aren’t the gating issue that they once were, it’s still vital to do a little testing to make sure that you’re getting the performance you need. And attention must be paid to the rest of the network value chain as well, particularly in terms of deploying gigabit Ethernet – .11n will swamp fast Ethernet without even trying very hard.
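The fast Ethernet warning is just arithmetic: compare an access point’s potential aggregate wireless throughput with its wired uplink. A sketch with illustrative (assumed) throughput numbers:

```python
FAST_ETHERNET_MBPS = 100
GIGABIT_MBPS = 1_000

def uplink_saturated(ap_aggregate_mbps, uplink_mbps):
    """True if the AP can push more traffic than its wired uplink carries."""
    return ap_aggregate_mbps > uplink_mbps

# A dual-band 802.11n AP moving, say, 160 Mbps of aggregate traffic:
print(uplink_saturated(160, FAST_ETHERNET_MBPS))  # True -- fast Ethernet chokes
print(uplink_saturated(160, GIGABIT_MBPS))        # False
```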
A full checklist could take several pages, but I hope my core message is clear: 802.11n is all you should be considering at present. The benefits are there, and the issues, as I noted above, have shifted. No matter – it’s full speed ahead, pun intended, to 802.11n.
Windows 7 migrations…Yeah, yeah, yeah – what a boring topic. Be that as it may, boring doesn’t mean unimportant. If you’re going to move forward with Windows 7, you might as well take the time to do it right. If that’s not enough incentive how about this: Time management experts have found that one minute in planning can save 5 minutes in execution. Now that’s some ROI you can bank on!
But where do you start with Windows 7 migrations? As I wrote in a previous post, Microsoft has their own toolkits. However, before you even go down that path, TechTarget has several good resources on the subject worth checking out. Here’s the rundown:
Finally, here’s a piece I wrote last summer on why it’s important to lock down XP before moving to Windows 7:
Sure, these topics aren’t all that glamorous but they sure can help you set yourself and your business up for success with your Windows 7 migration efforts.
A few months back, Microsoft extended the Windows 7 Enterprise Eval 90-Day Trial. We now have until the end of 2010 to take the “new” OS for a test drive.
As you’ve likely experienced, there is no better way to learn about a new OS or piece of software than trying it out for yourself. That’s what I did with Vista and was quickly turned off. That’s what I did with Windows 7 and was easily sold.
The Windows 7 Enterprise 90-day trial will shut down every hour after the trial period has ended…a MAJOR incentive to fish or cut bait – especially when you’re in the middle of something and lose all your work! Not that that has happened to me. Ahem…
After that, it’s up to you to decide if Windows 7 is right for your business. I’ve said it before and I’ll continue saying it: I believe Windows 7 is here to stay. All of my clients that I perform internal security vulnerability assessments for are at least testing Windows 7 in the enterprise. From what I can tell most are considering moving forward with it and eventually phasing out XP.
You owe it to yourself – and your users and your business – to at least try Windows 7. It’ll take a little getting used to but I’m pretty confident you’ll like it.
It’s been a year since Windows 7 RTM officially came to the market. My, how time flies. I feel like I was just getting to know it! Anyway, I came across some recent news bits and posts about Windows 7 that I thought were fitting.
Hardware manufacturer Asus ditches Windows 7 for Android on its upcoming Eee tablet computer. Interesting. I suspect other manufacturers have the same plans in mind. Maybe not for run-of-the-mill laptops and desktops but at least tablet devices. At least HP appears to be staying the course. Regardless, I never thought I’d live to see an OS that has the potential to push Microsoft aside.
Windows 7 SP1 is set to be released next year. Next year!? I’m not complaining; it just seems that two years is a long gap between the initial release of the OS and its first service pack. Maybe it’s a sign that Microsoft has finally gotten past its Vista ways.
One of my favorite new posts is on 15 keyboard shortcuts for Windows 7 – most of which I wasn’t aware of. What am I going to do with the extra 2 minutes I save each day by using these? Perhaps I can talk myself into leaving the office a little earlier.
Finally, one more thing that’s slightly off topic but still affects those of us working in the world of Windows is this bit on how Microsoft is giving all employees a new Windows 7 phone. Certainly a great way to get buy-in and spread the word on a device that Microsoft has otherwise given minimal attention up to this point. Is the Windows 7 phone another case of too little too late from Redmond? We’ll see.
Monday morning, Lockheed Martin did a funny thing: They released a major bit of enterprise social computing software, dubbed Eureka Streams, to the open source community. It’s a little bit like Yammer, a little bit like an Intranet and a little bit like Facebook, but not really a bit like what we’ve come to expect from our nation’s military suppliers, which have traditionally been pretty tight-lipped about what they’re building, what they’re charging and how you can use it.
Lest things get too weird, at least the promotional YouTube video is full of dull, comforting marketing drivel:
http://www.youtube.com/watch?v=uhefaGKRAkA
As it turns out, there are a lot of open source enthusiasts within our nation’s military-industrial complex, and just like in big business, open source is starting to find its own profitable, sustainable niche within military suppliers and the nation’s military itself. For proof, just look at the upcoming 2nd annual Mil-OSS Working Group conference (suits and ties strongly discouraged), which will feature almost 50 speakers on topics ranging from “Using Git to Overcome Traditional VCS Limitations” to “OZONE & OWF: A Community-wide GOTS initiative and its transition to GOSS.”
Not surprisingly, there’s some skepticism. Dana Blankenhorn writes:
To what do we owe the honor? Have the people sworn to protect us from fanatics in caves suddenly gained open source religion? Are they trying to ingratiate themselves with a new Administration which looks favorably on open source? Or are they trying to take it over, infiltrate it?
The answers to these questions are important, as is your speculation, because the welcome these projects get from the open source community will likely determine how much help they get. Reputation is vital in open source, and government often has a poor one.
Then there’s the quality of the offering itself. I don’t see anything in Eureka Streams I can’t do in Drupal, or a number of other high-quality open source projects that have existed for years. Lockheed has reinvented the wheel — why? And why should I help them push it up the hill?
Skepticism: Constructive. Unfounded speculation: Less so. Cheap potshots: Grow up.
Let’s look at the facts:
To what do we owe the honor? The national security industry has been making serious contributions to open source software in one way or another for a long time, and Dana’s reaction isn’t atypical. As Gunnar Hellekson recalled, the same skepticism greeted the NSA’s contributions to SE-Linux, many of which were later vetted and pulled into the kernel.
Reputation is vital in open source, and government often has a poor one. Recently I’d say it’s the opposite, and we’re finally starting to see the fruits of what people have suggested for years: That government and OSS should go hand-in-hand. See Whitehouse.gov. Or better yet, check out this post by Sun’s Bill Vass, written two years ago, pre-Obama:
Just recently, the House released The National Defense Authorization Act for Fiscal Year 2009 (H.R. 5658) which includes language that calls for all DoD agencies to consider open source software when procuring manned or unmanned aerial vehicles. Including such language is a milestone for the open source movement and just the beginning!
Joab Jackson of Government Computer News wrote this in his blog, “The Defense Department has traditionally been somewhat wary of OSS, at least for official duties. So some feel the language could pave the way for greater acceptance within the Defense community.”
I don’t see anything in Eureka Streams I can’t do in Drupal. Coming from an affirmed Drupal enthusiast (and proud member of the Wicked Pissah Usah Group), yes, you can do that in Drupal, just like you can build pretty much anything with it, but that doesn’t mean you can build it well, or easily, or without more re-inventing the wheel than makes it worth your while. That’s why we have Joomla, WordPress, Open Atrium and now … Eureka Streams, which actually appears to do a lot that none of those other platforms can do without a lot of work.
But at the end of the day, what this story comes down to is that the economics of open source are the same for Red Hat as they are for IBM as they are for Lockheed. Lockheed isn’t open sourcing Eureka Streams because the Bilderberg Group and the Illuminati are planning on stealing your SMB’s Intranet data. It’s open sourcing it because in today’s world, there are plenty of great business reasons to open source your software, particularly if your primary product is hardware (like missiles!) and not corporate Intranet platforms.
The same economics apply to the government and the military, too: Every wheel not re-invented because of forge.mil means more money, time and energy spent focused on protecting the country or reducing taxes (Well, that might be taking it a bit too far).
Let’s cut the open source polemics, just like the enterprise has. Open source projects require backing; sometimes that will be an active community, but just as often it will be the IBMs, federal governments, and even, yes, Lockheed Martins of the world.
Our Windows 7 in 2010 month is wrapping up, so we thought we’d do a little look back through the years.
Remember this guy? Windows 1.0, in all of its 16-bit glory, was released in November of 1985. This OS—available only on floppy disk—was supported until December 31, 2001. System requirements for Windows 1.0: 256 KB RAM, DOS 2.0, and two floppy drives.