Posted by: Brein Matturro
I remember very distinctly the day I was told about a new technology that would someday be commonplace. It was a perfect-weather day in northern Alabama: 70 degrees and not a cloud in the sky.
This “new technology” was a common database and the most efficient language on every platform that every system automatically understood and could execute. I was standing outside of a building in Madison catching a cigarette, and to say the least, I was skeptical.
Our PC tech was a big burly guy who “sort of” knew what he was doing. I never said anything, but whenever he got in over his head, the big boss would meet me outside in the smoking area and get the solution to the problem from me, so as not to embarrass him. His father was an old-timer with the company and probably could have had some political pull if we had embarrassed his son, so we did what we could to keep the peace.
Back when I smoked, I made the trek outside a few times a day. This particular day, Mr. Old Timer was outside sitting on the steps in the shade reading a trade magazine. It had an article that was all about the future of the Internet. Mr. Old Timer took great pride in attempting to one-up me any chance he got. He was preaching to me about how the path to the future would lead to every single operating system in the world running a single programming language.
He boasted about the author who knew everything there was to know about IT and how we “kids” knew nothing. The author stated very clearly that this new language would be the most efficient language on every machine in the world.
I spent my 10 minutes listening to him spout about how every machine and operating system in the world was so much better than anything IBM had, and about what a dog OS/400 was. I laughed, smiled, and flicked my cigarette butt about 30 feet, right into the ashtray, as I pulled open the door to go inside without saying a word.
Now, I said all that to say this: Back when I started programming, almost all programs were interpreted. Internal program data was clear-formatted text, with tags to identify what the data was and how to use it. We entered a BASIC program as a text file and saved it to tape. Then we ran it using the interpreter and the tape drive. We built these wonderful tags that we used to identify program data read from an external file, so that we didn’t have to store the data in the program itself, because we didn’t have much memory. We had a limit of 8K for any runtime module, including the loaded source.
Some of us were lucky enough to have a 32K expansion interface where we could store most of the data file information.
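For readers who never worked this way, the tag-and-value scheme looked roughly like this. This is a hypothetical sketch in Python, not the actual format we used; the tag names and the colon-delimited layout are invented for illustration:

```python
# Hypothetical sketch of a tagged clear-text data file: each line carries
# a tag naming the field and a plain-text value, so the program itself
# stores no data and stays under the tiny memory limit.
records = """NAME:WIDGET-01
QTY:42
PRICE:19.95"""

def parse_tagged(text):
    """Parse TAG:value lines into a dict of field name -> string value."""
    data = {}
    for line in text.splitlines():
        tag, _, value = line.partition(":")
        data[tag] = value
    return data

parsed = parse_tagged(records)
print(parsed["QTY"])  # prints "42" -- everything stays text until used
```

The point of the tags was that the interpreter never needed a schema compiled into the program; the data described itself.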
Later, when we had disk drives and much more memory, we could write real programs and actually compile them into an executable. We stored data in packed formats to save disk space and compiled our programs so they ran fast. What a world it became. Machines kept getting bigger and faster and those of us who knew how to save space and write compact code with small databases were kings. Then came the hardware revolutions. Memory got cheaper and disk space was almost thrown away. Why pack your data and compress out blanks? Disk space is cheap…
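Packing data meant squeezing two decimal digits into each byte instead of storing one character per digit. Here is a simplified sketch of that binary-coded-decimal trick (sign nibbles and field alignment omitted; this illustrates the idea, not any particular system’s format):

```python
def pack_decimal(number_str):
    """Pack a string of decimal digits two-per-byte (BCD), the classic
    trick for halving the disk space a numeric field occupied."""
    digits = number_str.lstrip("+-")
    if len(digits) % 2:
        digits = "0" + digits          # pad to an even digit count
    packed = bytearray()
    for i in range(0, len(digits), 2):
        hi, lo = int(digits[i]), int(digits[i + 1])
        packed.append((hi << 4) | lo)  # two digits share one byte
    return bytes(packed)

pack_decimal("1995")  # -> b'\x19\x95': 2 bytes instead of 4 characters
```

Half the bytes on every numeric field added up fast when a whole fixed disk held a couple of megabytes.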
Now look where we are. Not much is compiled and compiling really doesn’t do much for efficiency. It’s not unusual to have programs that are larger than my first fixed disk drive, all 2 megabytes, partitioned into multiple drives, of course.
I guess the guy knew what he was talking about because certainly there is only one operating system, Windoze, and only one computer language today, Java. Go figure. I guess when they say “what goes around, comes around” they aren’t kidding.
Of course, I wouldn’t know. I’m just a flunky programmer.