Thanks to a regular correspondent and professional associate of mine, I’ve been involved recently in an interesting discussion about college degrees versus IT certifications. I wrote a blog here last Friday in which I opined the following: “for at least some certifications (most notably, the CCIE, in my colleague’s case), they are almost analogous to a college degree.” Equivocations aside, my colleague cried foul (using pithier language, often abbreviated as “BS”) on this remark, so of course I immediately leaped to my own defense.
First and foremost, I did equivocate in this remark by using “at least some” and “almost analogous,” so I don’t think I can be accused of saying they’re exactly the same. But as I think about it further, and compare the time, cost, and effort involved in earning a typical IT certification (NOT the CCIE or other premier credentials) against what a degree demands, I have to concede that my colleague is right. Even an associate’s degree usually requires 60 credit hours of classes, at no less than $60 per credit hour at local community colleges, plus books, fees, and other costs that probably add up to no less than $5,000, and may easily top $10,000. Then, too, 60 credit hours usually means 720 hours in the classroom, probably as many more getting back and forth to school, and another 700-1,400 hours for labs and studying. A bachelor’s involves at least double this effort, and private schools usually charge $200 and up per credit hour rather than the bargain-basement $60 you might pay at a community college. A master’s degree may involve only 30 hours of classes, but you must pay more for those hours, and usually write a thesis (or take more classes in lieu of same). As for a PhD: three years of classes, plus one to three more years to research, write, and defend a dissertation, for at least three times the nut associated with an associate’s degree, and very often much more than that.
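For anyone who wants to check my back-of-envelope math, here’s a quick tally of the associate’s-degree figures above. Every number is just my own ballpark estimate from the paragraph, not authoritative data on college costs:

```python
# Tally of the blog's rough associate's-degree estimates.
# All constants are the article's own ballpark figures.

CREDITS = 60
TUITION_PER_CREDIT = 60           # community-college rate cited above
CLASSROOM_HOURS = 720             # the post's figure for 60 credit hours
COMMUTE_HOURS = CLASSROOM_HOURS   # "probably as many more" going to and fro
STUDY_HOURS_LOW, STUDY_HOURS_HIGH = 700, 1400  # labs and studying

tuition = CREDITS * TUITION_PER_CREDIT
print(f"Tuition alone: ${tuition:,}")  # before books, fees, etc.

hours_low = CLASSROOM_HOURS + COMMUTE_HOURS + STUDY_HOURS_LOW
hours_high = CLASSROOM_HOURS + COMMUTE_HOURS + STUDY_HOURS_HIGH
print(f"Total time: {hours_low:,}-{hours_high:,} hours")
```

Tuition alone comes to $3,600, which is why books, fees, and the rest push the realistic total into the $5,000-$10,000 range; the time commitment lands somewhere around 2,100-2,800 hours, which dwarfs what a typical IT certification asks of a candidate.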
So yeah, now that I re-think my position and really look at the numbers for the time, money, and activity involved in earning an IT certification versus a college degree, I have to concede my colleague was right. It would have been much more accurate for me to say “For a very few IT certifications, such as the CCIE, the time and effort required to earn the credential might get close to the work required to earn a two-year degree, but otherwise, degrees trump certifications for time, effort, and cost.”
Does this mean that degrees are more important than IT certifications? This is an interesting question, to which the answer must be “It depends on what field the degree is in, the granting institution, and how old the degree is, versus the currency and perceived value of the corresponding IT certification (or more probably, certifications plural) under consideration.” Degrees never expire; they indicate a sustained effort to learn a curriculum and meet specific graduation requirements, and they tell employers that candidates could complete such an effort. Certs, on the other hand, do expire, are more narrowly focused, take less time, effort, and money to complete, and are more like the merit badges to which Tom Hollingsworth compared them in his recent blog that provoked my less-than-accurate comparison of these two things last Friday.
And while this doesn’t mean that certs are worthless, it actually helps to explain why employers usually prefer job candidates to have both degrees and current IT certifications: the degree to indicate an ability to raise the money and do the work to complete a sustained and demanding set of activities, and the certifications to indicate a measure of interest and ability in IT-related topics germane to a particular job or job role that the employer wants to fill.
‘Nuff said, I hope!