I had a lot to comment on in my last post, so rather than cram everything in, I decided to continue my comments on my list of considerations for legacy application rewrites in this post.
Very early on in the review of a potential project, I like to look at the tables, the relationships between them, and any indexes defined on them. At one point I developed an estimating method that assigned a basic value to each table based on its number of fields, relationships, and indexes: the more of each, the greater the cost to replicate. This was a time-consuming analysis, but it gave me a significantly more in-depth understanding of what I was getting myself into. The problem was that I never did enough of these estimates to establish meaningful values. The other thing I did (which I highly recommend against!) was to decide I was making it too complicated and must therefore be misjudging the time it would take. The trouble was that while estimates made this way won the work, they won it because I was “giving it away” by ignoring what I believed the job would really take based on the details I knew.
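The scoring idea could be sketched roughly as follows. To be clear, the specific weights and the sample tables here are hypothetical placeholders, not values from the original method; as noted above, meaningful values were never established, so anything like this would need calibrating against completed projects.

```python
# Hypothetical sketch of a per-table replication-cost score.
# Weights are placeholders to be calibrated from past projects.
def table_score(num_fields, num_relationships, num_indexes,
                field_weight=1.0, rel_weight=3.0, index_weight=2.0):
    """Assign a basic cost value to one table: more fields,
    relationships, and indexes mean a greater cost to replicate."""
    return (num_fields * field_weight
            + num_relationships * rel_weight
            + num_indexes * index_weight)

# Estimate a whole schema by summing per-table scores.
tables = [
    {"name": "customers", "fields": 18, "relationships": 4, "indexes": 3},
    {"name": "orders",    "fields": 12, "relationships": 2, "indexes": 2},
]
total = sum(table_score(t["fields"], t["relationships"], t["indexes"])
            for t in tables)
```

The point of a scheme like this is not the particular weights but the discipline: it forces you to walk every table before quoting, which is where the in-depth understanding comes from.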
Legacy apps that are not based on a relational database are, in my opinion, the most difficult to estimate. I believe that to estimate such a potential project you must essentially design the tables, relationships, and indexes, at least on paper. A diagram of the tables and relationships can be a valuable asset here as well.
Yet another time-consuming exercise best taken on early is an evaluation of the ratio of entry/processing programs to reporting programs. In my experience, entry/processing programs are generally more complex than the typical reporting program. Of course, I have had plenty of very complex reports that break this “rule”, but they have been the exception.
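That program-mix evaluation might look something like the sketch below. The complexity multiplier is a made-up assumption of mine (the text only says entry/processing programs are generally more complex, not by how much), so treat it as a knob to tune, not a known figure.

```python
# Hypothetical sketch: weight entry/processing programs above
# reports when sizing a rewrite. The multiplier is an assumption.
def program_mix_estimate(entry_count, report_count, entry_multiplier=2.0):
    """Return the entry-to-report ratio and a weighted program count
    in which each entry/processing program counts as entry_multiplier
    ordinary reporting programs."""
    ratio = entry_count / report_count if report_count else float("inf")
    weighted_total = entry_count * entry_multiplier + report_count
    return ratio, weighted_total

# Example: an app with 30 entry/processing programs and 60 reports.
ratio, weighted = program_mix_estimate(30, 60)
```

Even a crude weighting like this keeps two apps with the same raw program count from being quoted the same when one is entry-heavy and the other is mostly reports.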