They often start with “By the way…”; another flavor is “While you’re here…”, or maybe “I was thinking…”. They belong to a class of communication sometimes laced with peril, sometimes with opportunity, and sometimes with just a lot of work, but ALWAYS revealing! In my experience, after a barrage of “By the ways” at a client it takes me a bit to recover. “What just happened?” runs through my mind, followed by “Did I get everything done that I needed to?” Usually I need to reflect upon the “By the ways” after a period of decompression.
Today I spent 5 hours at a client site, and since I was prepared for the inevitable “By the ways” I was not taken by surprise when they came. One thing I love about “By the ways” is how the questions or comments that follow them reveal the needs or desires of the client user community! These “By the ways” may indicate shortcomings with the network, the applications or the processes themselves. Whatever their nature on any given visit, they present an opportunity for the service provider who is keyed into hearing the needs of the client they’re serving.
A provider can leave a “hero”, and who wouldn’t want that? For the developer, being a hero may mean new work as the “I’ve been thinking…” becomes a “Let’s do it!” Perhaps a better title for this post would have been “Be Aware of the ‘By the Ways’” 🙂
News has been spreading throughout the press about the case of Rajendrashinh Makwana, a contractor with Fannie Mae who is accused of planting a “logic bomb” in the FNM environment which was designed to destroy data throughout the FNM network. I’m sure we’ll be hearing much about this case as time goes on and details continue to be revealed. That this individual, as a contractor at such a large operation, had the “root” passwords simply amazes me (as well as scares me!). As the story goes, it appears that it was sheer luck (…or perhaps Divine guidance) that the destructive script was located.
Of course, I’m a contractor for my clients (all small), and I, like Mr. Makwana, know their “root” and Admin passwords, since I couldn’t do what I do for them without that knowledge. But here is a case where size does make a difference. If in fact Mr. Makwana’s script could have destroyed access to some 4,000 network servers, I would have to question why any individual, employee or contractor, would be given such “power”. I would think that at the very least it would be “spread out”.
Some of what I’ve reviewed so far about this case has also made reference to a sense that “any” large organization has security in its backups. Well, I don’t know about you, but I wouldn’t want to have to resort to the backups! Even if not a byte of data was lost, the loss from the disaster and downtime would be huge, certainly NOT what we need in our faltering economy! As for backups, my experience generally has been that they are questionable at best. When were they last tested? In many cases I know of, the backups were never tested.
For a small business the loss can be managed, sort of. However, given the size and industry of a Fannie Mae, any loss can mean significant costs to recover. Here we have another indication of vulnerability; sure am glad the other tech “spotted” the script! Whether we like it or not, we humans inhabiting planet Earth are vulnerable and dependent upon each other in more ways than most of us are comfortable with.
A post here in these ITKnowledgeExchange blogs that recently caught my eye was this one written by Stephen Harris entitled “Data Challenges Can be Solved With Business Intelligence”. It is a rather lengthy post touching on several points about data challenges and BI. What I immediately latched onto in his post was what he refers to as a motto: “Thou shalt know thy data”.
While I have never phrased my firm belief in knowing your data the way he does, I certainly agree that knowing your data is an absolute must. Furthermore, the cleansing, auditing, securing, managing and refreshing of data that he references are also essential ingredients in any meaningful reporting, never mind the special requirements of an effective BI implementation.
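As a small illustration of the “cleansing” step (the names and values here are invented for illustration, not taken from any client system), even trivial tidying of text fields catches duplicates that would otherwise skew a report:

```python
# A minimal data-cleansing sketch: trimming whitespace and normalizing case
# on customer names before loading reveals duplicates that raw values hide.
raw_names = ["Acme Co", " acme co ", "ACME CO", "Beta LLC"]

def clean(name: str) -> str:
    # Collapse runs of whitespace and normalize capitalization.
    return " ".join(name.split()).title()

unique = sorted({clean(n) for n in raw_names})
print(unique)  # ['Acme Co', 'Beta Llc']
```

Three of the four raw entries turn out to be the same customer; a report grouped on the raw field would have counted them separately.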
Once again I find myself “down sizing” information and ideas I read about to the needs of the businesses which I service, the small ones. I’ve blogged recently about reporting requirements in these economic times, and certainly “…having information about your business at your fingertips…” is critical, not just a “nice to have”.
Reporting, BI and data “cleanliness” all depend to some extent upon the normalization of the data. I can’t imagine trying to normalize a database without knowing your data. If you would like a quick introduction to the topic of normalization I found “Introduction to Data Normalization: A Database “Best” Practice” to be an excellent place to start.
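To make the idea concrete, here is a minimal sketch of a first normalization step, using SQLite and invented table and field names (none of this reflects any particular client database): repeated customer details are split out of the order rows into their own table.

```python
import sqlite3

# Tiny normalization illustration: customer details that repeat on every
# order row are moved into a customers table, and orders reference it by key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer name/city repeated on every order row.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_city TEXT, amount REAL)""")
cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?)", [
    (1, "Acme Co", "Hartford", 100.0),
    (2, "Acme Co", "Hartford", 250.0),
    (3, "Beta LLC", "Boston", 75.0),
])

# Normalized: one row per customer; orders carry only the customer key.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.execute("""INSERT INTO customers (name, city)
               SELECT DISTINCT customer_name, customer_city FROM orders_flat""")
cur.execute("""INSERT INTO orders
               SELECT f.order_id, c.id, f.amount
               FROM orders_flat f JOIN customers c
                 ON c.name = f.customer_name AND c.city = f.customer_city""")

# A change of city now touches one row instead of every order for that customer.
cur.execute("UPDATE customers SET city = 'New Haven' WHERE name = 'Acme Co'")
print(cur.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 2
```

The tradeoff the next paragraph describes shows up even here: reporting on orders now requires a join back to `customers`, which is the price paid for eliminating the repeated data.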
As with so many areas in development, there are multitudes of tradeoffs which come into play in the design of a database. It is absolutely critical that the developer know and understand the data pieces (fields) and how they relate, but it is just as critical that the developer understand the reporting requirements and other characteristics of the data, the database itself, the network and hardware platform, and “how” the data will be queried. Many speed issues are actually caused by a database which has been normalized to such an extent that, in order to produce the required reporting in an acceptable time span, many extra steps are needed to prep the data for the desired presentation sequence.
The more up close and personal a developer is with the data the greater the opportunity there is to evaluate the data quality. After there have been a number of changes in the form of additions and subtractions to fields or tables in the database it is a good practice to review the design again to determine if there are changes that should be made to further normalize the database. My experience indicates that often changes are desired.
My previous post pointed out a few of the reasons why one might choose to create a custom application. For this post I’d like to further the discussion to review some of the reasons (valid and otherwise) for NOT creating a custom application. I would include in this category:
- FEAR – Many a story has been told about the “dream” development project that became a nightmare; the remedy here is to carefully select your developer(s)
- It will be “Too Costly” – It doesn’t have to be in most cases
- “Standards” won’t allow it
- Too big of a commitment
- No “New” requirements are on the horizon
- Suitable “Off-the-shelf” software for your industry is available and within your budget range
- Your operation is running efficiently, programs being used suit your needs
- Users are getting what they need from the existing system
- Existing system is technically “up-to-date”
Certainly if the last four points above hold true for your operation, one would be hard-pressed to make a valid case for a custom program: you just don’t need it! As for the standards issue, there’s not much I can say about it. I have seen companies struggle with exceptionally ineffective and inefficient software only because a proposed app wouldn’t “fit” the “Standard”, and I have to seriously wonder about that decision. In some instances it seems that the “Fear”, “Too costly” or “fear of commitment” arguments come into play.
Next Post — “Identifying Potentially Valuable Custom Applications”
Without fail, in my experience as a developer any potentially valuable custom application project has been the result of identifying some area of operation which:
- Has become unwieldy, requiring too many steps or multiple data entry
- Has become more and more demanding as a result of increased volume
- Always seemed to be behind time – trying to play catch-up and not succeeding
- Required information availability not currently or effectively available
- Just “seemed” to take too long to accomplish
My post Application Value is an excellent example of creating a custom application because of unwieldiness. Another application which I designed that provided for scanning and categorizing of tax exempt certificates was an application which solved all of the issues mentioned above.
Availability of information is a key factor in identifying value. For example, an operation may be required to maintain original records to produce during audits, yet that same information may also be required at outlying locations. The choice can be to maintain a file at each location (trying to keep it up-to-date at all times, of course), or to somehow provide the means to quickly access the information via a custom application. “Scanning and cataloguing” such information can provide the ready access required to solve the problem, and there probably isn’t an “off-the-shelf” program to help!
Bottom line to identifying potentially valuable applications — Review operations closely to identify any of the categories listed above. Once identified – it’s time to get creative! 🙂
First, in the interest of full disclosure, I must confess that perhaps I am biased on the subject of custom applications. While writing my previous post I found myself once again becoming excited about application development, and especially about the value that such an application can bring to an operation. I suspect that thousands, maybe hundreds of thousands, of potentially high-value applications remain undeveloped because those responsible for the operation do not realize that it is possible to have this kind of efficient, specific and valuable application developed at a reasonable cost.
Development tools available today provide the means to easily produce an application in stages, creating an initial database, entry forms and limited reporting as users are loading data into master tables etc. While my experience with development tools other than Visual Dataflex (VDF) is limited, I’m aware that there are multiple paths to produce powerful applications for a reasonable cost. The applications can easily be multi-user at the start and grow as additional functionality is desired. No need to use an expensive database, although the tools in most cases can “talk” with multiple databases.
So, back to “Why a Custom Application?” I’d list the following:
- Custom Apps can be “lean and mean” – no “extras” needed
- “New” functionality is added only as desired, not at a designer’s whim
- “Lean and mean” is efficient
- More “understandable” for users – thus easier to use with fewer errors
- A well developed app can provide excellent return on investment
- “Off-the-shelf” software doesn’t really “fit” your needs or requires too much in the way of training, maintenance or “tweaking” for it to be of value
- Your industry doesn’t have any “canned” vertical market software available, or it is designed for too large/small an operation, is too costly, doesn’t “fit” the way you want/need it to.
You can probably think of many others as well, but I believe the list above covers the gamut.
Next Post — “Factors Against a Custom Application?”
Earlier this week I was speaking with a client for whom I had developed a custom solution for tracking and maintaining records of truck-use mileage. The company has multiple locations scattered within a 200-mile radius or so, crossing state lines. The project started out specifically as the result of increasing difficulty in getting the individual logs and information back to corporate headquarters in time to create the reporting required by the governing agencies. The existing process required significant time to assemble, organize and prepare reports from the existing logs (…which more often than not were not legible!). In fact, the person whose task it was to prepare the reports figured they spent somewhere between 8 and 12 hours preparing the reports each month.
From my developer perspective this was an excellent demonstration of a need for which a solution could easily be developed. The goals of the program were to establish:
- Timely reporting
- Increased accuracy
- Painless (read that as EASY) operation
To accomplish these goals, a multi-user database application was created to collect log information on a daily basis. A key element was an EXTREMELY simple user interface for entering the log information, designed for employees without typing skills. This was accomplished through extensive use of selection lists, calendar date pickers, combo forms and checkboxes; mostly it was the odometer readings (i.e. start/stop readings) that had to be entered with the keyboard. Validation was built into the program to check the “reasonableness” of the entries (e.g. an entry showing a truck doing 1000 miles in a day would not be accepted!).
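The “reasonableness” check described above might be sketched like this. The function name, field names and the 1,000-mile cap are my own stand-ins (the actual application was built in Visual Dataflex, with its own thresholds):

```python
# A sketch of daily-log entry validation: reject odometer readings that are
# out of order or that imply an implausible distance for a single day.
MAX_MILES_PER_DAY = 1000  # assumed cap; the real app's threshold may differ

def validate_log_entry(start_odometer: int, stop_odometer: int):
    """Return (ok, message) for one day's start/stop odometer pair."""
    if stop_odometer < start_odometer:
        return False, "Stop reading is lower than start reading"
    miles = stop_odometer - start_odometer
    if miles > MAX_MILES_PER_DAY:
        return False, f"{miles} miles in one day fails the reasonableness check"
    return True, f"OK: {miles} miles logged"

print(validate_log_entry(12000, 12250))  # a normal day's run is accepted
print(validate_log_entry(12000, 13500))  # 1500 miles in a day is rejected
```

Catching a transposed or impossible reading at entry time, while the driver can still be asked about it, is far cheaper than discovering it at month-end report time.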
As (I believe) with any application, the “real” value of the application has come with the variety of reporting now immediately available. The system is “real time” so a quick report run to the screen any time during the month can easily show “missing” or errant entries which can be followed up on while memories are still fresh.
As for the “value” of the application (…which by the way has been running for about 5 years now!), it’s difficult to quantify the accuracy gains, but certainly there is much more confidence in the current data (which looks quite different from previous data, probably because of inaccuracies in the previous methods). As for the time spent preparing reports, a savings of hours per month has added up over 5 years! The value of information availability is again hard to quantify; however, as a sidebar, using the log has enabled better tracking of preventive maintenance for the vehicles. Certainly a benefit!
I love hearing reports like this, where an application shows such clear value! (Especially when it’s a piece of my work!)
It’s been a while since I’ve had to work with a detailed application specification, for which I am greatly thankful! Generally speaking, my “attitude” toward such specifications has not been one of grateful acceptance, as I’ve seen too often how they become so rigid as to “get in the way” of being a truly useful means of communicating the “real” needs of the application to be developed.
This morning, Bob Lewis of IT Catalysts once again caught my attention in his newsletter “Keep the Joint Running”. I’ve often mentioned his writing in this blog, and once again I am moved to comment here. It was a simple paragraph within his writing that prompted me to immediately say “Yes!”, “Right on!”
He stated that “When developing software, or when designing business change even more, adherence to specifications isn’t the goal. It can’t be, because each set of specifications is used only once, is open to interpretation besides, and usually turns out to be the result of incomplete thinking...”. (Italics added). Bob’s statement quoted here is exactly the reason why I have such an “attitude” regarding most specifications.
My very first post in this blog “Please Hear What I ‘Really’ Need” was actually my way of stating what Bob says above. My post is very much the story of a project gone wrong with the “incomplete thinking” of which Bob mentions. Creating custom applications, specifying custom applications and designing custom applications is NO TRIVIAL TASK! Don’t try this at home! 🙂
My subject for this post – Reporting! (Perhaps you guessed! 🙂 )
A few days ago I posted about “Green IT” and reporting methods and considerations to use less paper. My point today regarding reporting is simply this: I posit that it is reporting which provides the greatest value of any business IT system. Given this position, I further venture to say that in this economic climate there will be an ever-increasing demand for reporting which previously was neither desired, requested nor dreamed of! Just within my limited client base I have seen increased interest in looking at new ways to report the wealth of information contained within client systems.
In addition, I have also seen that in some cases getting the information now being asked for will require additional input (since, of course, if data isn’t available it cannot be reported 🙂 ). This presents an interesting mix of work, application maintenance and report building; just what the doctor ordered for an otherwise “slow” time.
Through my years in IT and application development I have experienced, many times, users’ frustration regarding the presentation of their “data” in meaningful ways, or worse yet, their frustration in knowing that data useful to them is simply “inaccessible”, “locked up” within the database somewhere. During times such as these, management becomes more interested in the “finer points” of their operations: “What does this cost us? … How about that? … Can we look at …?”. During busier times there seems to be a fear of having “too much” information, and surely “information overload” is easy to accomplish! But in these tight times the value of specific, pinpointed reporting cannot be overlooked.
Perhaps this is an appropriate time to be looking at what we can do in reporting to help our companies to succeed during this economic downturn. Wherever possible it seems appropriate that we application developers “suggest” reporting that will add value. Heck, it might also mean some job security 🙂
This probably isn’t the time, however, to suggest starting on implementing a new BI system, although wouldn’t that be nice?
This morning I overheard GMA making reference to the case of Julie Amero which was the subject of my blog in December titled “Unsavory Justice – Julie Amero vs Connecticut“. The ABC News story “Wrong Computer Click Ruined My Life” doesn’t present any new facts in the case, but does present the case before perhaps a much different audience than it might have received previously. The aforementioned links provide the background which I will not expand upon.
However, what I do want to suggest is that there are lessons to be learned from this:
- The need for training (…over and over again I see training as the number one need!) Those of us who are computer literate make many false assumptions about users; we need to rid ourselves of those assumptions. I’ve always told the users I was training, “You’re trainable”, but that doesn’t mean they’re trained! The training MUST be ongoing.
- The IT department perhaps was not well enough trained either; otherwise, the proper precautions that would have prevented the site accesses would have been in place. (NOTE that this assumes that IF they had known what to do they would have done it, an assumption which could be erroneous, since there is a cost associated with doing the “right” thing!)
- Leaving a system unattended can lead to mischief or mishaps – whether it’s a school classroom or a business environment, whether it’s a “dumb” use, or an intentional misuse. If you leave a computer that you’re responsible for, log off!
I’m sure that there are other lessons to gain from this as well, and certainly a variety of actions that can be taken. I just couldn’t let this sit today, even though I’ve blogged on it before! The case is unsettling to me!