I'm currently trying to prove a concept: that we should re-create our old existing PFs as SQL-defined tables (portability, performance, etc.).
After taking a look at some files, I noticed an even bigger problem that could ruin my chance of proving the concept: unless some major cleanup happens first, we won't see any benefit from the conversion.
Here's my situation:
One Physical File has 88 million records with over 150 Logical Files built over it.
Am I crazy? Or is that just ridiculous?
Apparently, this isn't the only file; I've found at least 5 or 6 more with the same problem. Even our smaller files average 20 to 30 logical files over the data.
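To size the problem, something like this should give a count of dependent files per PF for a whole library. I'm assuming the cross-reference file QSYS/QADBFDEP and those column names (DBFLIB/DBFFIL for the based-on file, DBFLDP/DBFFDP for the dependent file); verify them with DSPFFD FILE(QSYS/QADBFDEP) on your release. DSPDBR FILE(MYLIB/MYPF) OUTPUT(*OUTFILE) gives the same information one file at a time.

    -- Count dependent files (mostly LFs) per based-on PF in one library.
    -- Assumes the QADBFDEP column names described above; MYLIB is made up.
    SELECT DBFLIB, DBFFIL, COUNT(*) AS LF_COUNT
        FROM QSYS.QADBFDEP
        WHERE DBFLIB = 'MYLIB'
        GROUP BY DBFLIB, DBFFIL
        ORDER BY LF_COUNT DESC;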
They also don't seem to think this is a major problem, or that we should spend much time on it.
So, what I'm asking for is opinions and ideas on how to prove to them that spending the time (and it will be a LOT of time) to analyze our database and clean up all these problems is worth it.
Here's my idea for an approach to each file:
1. Analyze all of the logical files and remove any not currently used.
2. Check all programs using the logical files and remove them if no longer used.
3. Identify programs that should be re-written to eliminate logical files.
4. Re-write said programs.
5. Create history files to reduce the number of records.
6. Convert PFs and LFs to tables and indexes (rough DDL sketch below).
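For that last step, here's the rough shape of the conversion I have in mind for a single file. Every name below is made up; the point is just that the DDS-described PF becomes an SQL table and each keyed LF that survives the cleanup becomes an index:

    -- Hypothetical replacement for a DDS physical file.
    CREATE TABLE MYLIB.ORDER_HDR (
        ORDER_ID  DECIMAL(9, 0) NOT NULL,
        CUST_ID   DECIMAL(9, 0) NOT NULL,
        ORDER_DT  DATE          NOT NULL,
        STATUS    CHAR(1)       NOT NULL DEFAULT 'O',
        PRIMARY KEY (ORDER_ID)
    );

    -- Hypothetical replacement for one of the keyed LFs.
    CREATE INDEX MYLIB.ORDER_HDR_BY_CUST
        ON MYLIB.ORDER_HDR (CUST_ID, ORDER_DT);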
Just taking a quick look at the indexes (LFs) using iSeries Nav, I found a lot of information. One thing I noticed in the "Last Build" column was that they were last built in 2002. I'm guessing that even if I just rebuild those logical files, we'd see a performance boost. But what is considered rebuilding? Re-compiling?
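If rebuilding just means forcing the access path to be built fresh (it shouldn't require re-compiling anything), then once these become SQL indexes I could presumably force that with a drop and re-create, using the same made-up names as above:

    DROP INDEX MYLIB.ORDER_HDR_BY_CUST;

    -- Re-creating the index makes DB2 build the access path from scratch.
    CREATE INDEX MYLIB.ORDER_HDR_BY_CUST
        ON MYLIB.ORDER_HDR (CUST_ID, ORDER_DT);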
I apologize for the lengthy post. I have a lot more questions, but I just wanted to get a feel for some ideas first.