I’m working on a code project called the bowling kata, mostly as an exercise to teach my fingers to think in Ruby. I also have a brand new Macintosh, and I thought the experience of re-installing all the command line tools might be fun, and even generate some material.
The generating-material part turned out to be true. The fun … not so much.
It all started when I wanted a unit test framework and tried to install Minitest, which I read about in Practical Object-Oriented Design in Ruby, sometimes called “POODR.” (A fantastic, short read, by the way.) So I googled “install Minitest.” The fourth Google search result, the Minitest RubyGems page, told me to use:
gem install minitest
I did that and got:
ERROR: While executing gem … (Gem::FilePermissionError)
You don’t have write permissions for the /Library/Ruby/Gems/2.0.0 directory.
Googling the error message sent me to this Stack Overflow page.
Apparently OS X 10.8 (Mountain Lion) comes with its own version of Ruby. I could sudo the install, but that might install various things as root and screw up permissions for the user who will actually be running the Ruby stuff. So I looked into a version manager, RVM (rvm.io).
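If you hit the same wall, Ruby can tell you exactly which install you are on and which directory `gem install` is fighting over; `Gem.dir` and `Gem.user_dir` are standard RubyGems methods:

```ruby
require "rubygems"  # loaded by default in modern Rubies; explicit here for clarity

puts RUBY_VERSION   # the Ruby the system put on the PATH
puts Gem.dir        # the system-wide gem directory (the one I lacked write access to)
puts Gem.user_dir   # a per-user gem directory that does not require sudo
```

If `Gem.dir` points somewhere under /Library, that is the system Ruby talking, which is exactly the situation the permission error was warning about.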
Except, of course, my Mac does not have gpg installed, so the first round of the install failed. Googling “RVM install mac,” I found that I need Xcode (which I would have put on sooner or later), but that I also need the UNIX-style command line tools that Xcode installs … which Xcode no longer installs. But that’s okay, because I can install them from the Xcode 4.4 preferences panel: Preferences → Downloads → Components → Install.
That didn’t work; the instructions were probably old. I googled and found I could get the command line tools other ways. Which, of course, won’t be enough, because I need to install a compiler (I will need that eventually for Watir), and to install the compiler I need MacPorts, which I don’t have yet.
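For perspective, the payoff I was chasing is tiny. Minitest has actually shipped as a bundled gem with Ruby since 1.9, so once any working Ruby exists, a first kata file runs with no extra installs at all. The class and the scoring below are my own sketch, not anything from the book: it just sums pins, with no spare or strike bonuses yet.

```ruby
require "minitest/autorun"

# Minimal scorer for the bowling kata: sum the pins, nothing fancy yet.
class BowlingGame
  def initialize
    @rolls = []
  end

  def roll(pins)
    @rolls << pins
  end

  def score
    @rolls.inject(0, :+)
  end
end

class BowlingGameTest < Minitest::Test
  def test_gutter_game_scores_zero
    game = BowlingGame.new
    20.times { game.roll(0) }
    assert_equal 0, game.score
  end
end
```

Save it as bowling_test.rb, run `ruby bowling_test.rb`, and minitest/autorun takes care of finding and running the test.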
The Hard Truth
All these open source tools are built on top of a series of dependencies that work if installed correctly at the right versions. Each new version of the operating system, especially a UNIX-y operating system, ships different versions of some of those dependencies, likely incompatible ones; the current versions of the tools themselves can be incompatible with each other. (That is, the combination of tools worked, at one time, for one specific combination of versions.)
Anyone who hit the old problem might blog about it, but that solution may no longer be appropriate for the current version of the software.
The result is that the demo at the conference, or user’s group, or screencast looks pretty sweet, but there is a substantial barrier to entry.
What people actually do
Everyone has to struggle through this. Once it is done, you set up a default virtual machine with everything you need, then store it in a library somewhere. Ideally, you could use a tool like Bundler to create a bundle, then attach the dependencies to the project in GitHub. All new web servers, development environments, and test environments can be pulled from this “canonical version” of the software that “just works.” If programmers want to work on their local machines, they likely have a very standard OS/machine combination and, for that shop, a single-page list of install directions that just works.
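That Bundler step amounts to writing down the working combination. A sketch of the idea; the gems and version constraints here are examples of pinning, not anything a particular project requires:

```ruby
# Gemfile -- example only; pin the versions known to work together.
source "https://rubygems.org"

gem "minitest", "~> 5.0"   # the test framework from earlier
gem "watir", "~> 6.0"      # example pin for the browser-driving gem mentioned above
```

Checking both the Gemfile and the generated Gemfile.lock into the repository is what makes the “canonical version” reproducible: `bundle install` on a fresh machine recreates the same set.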
Some Charles needs to figure this out once. If you have to do it yourself, congratulations: you get to be Charles. Keep good notes.
The result of all this is that getting up and running is a bit of a rite of passage, one that weeds out the wannabes and the people who simply don’t have the time to learn all the things.
I’m not complaining, at least not for me. This stuff is free. I have two degrees in technology, and the life energy and time to figure this out. (For those of us who deal with the intersection of writing, training, and technology, you might say the problem with open source documentation is an opportunity.)
Still, the next time someone complains about the underrepresentation of the disadvantaged in technology, I’ve got to say:
First: I get it.
We have work to do.