Performance issue with reading an AS/400 primary file

Tags: AS/400, RPGLE
My program reads a primary file (which is basically a join logical file) and generates a report. The program runs in batch mode and had been doing pretty well all along, but for the past few months report generation has been taking more time. Note that no changes have been made to the program in the past few years.

Answer Wiki


It may be a number of things. Do a DSPFD and see if there are a lot of deleted records. If so, you may want to run a RGZPFM to cut down the size of the file and its indexes. Yes, the old deleted data is still there, and may be recoverable, until you run the RGZPFM. If this is a common issue you can change the file to reuse deleted records (REUSEDLT). A second option is to add a key to the file if one does not exist; the program may then need to be modified to SETLL on the starting date to speed things up.
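For example (library, file, and field names below are placeholders, not taken from the question), reuse of deleted records is switched on with:

    CHGPF FILE(MYLIB/ORDHIST) REUSEDLT(*YES)

And a rough free-format RPGLE sketch of the SETLL idea, assuming a logical file ORDHISTL1 keyed on the report date, might look like this. It replaces the input-primary cycle with a procedural read loop, so it is only a sketch of the approach, not a drop-in change:

    **free
    dcl-f ordhistl1 usage(*input) keyed;        // keyed logical over the report data
    dcl-s startDate date inz(d'2024-01-01');    // report starting date (placeholder)

    setll startDate ordhistl1;                  // position at first record on/after startDate
    read ordhistl1;
    dow not %eof(ordhistl1);
      // ... accumulate and print report detail here ...
      read ordhistl1;
    enddo;

    *inlr = *on;
    return;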

Discuss This Question: 7  Replies

 
  • pdraebel
    I would take a look at your data files that get joined. Do they have lots of deleted records? A reorganize of the file(s) may restore your performance to prior levels.
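    For example (library and file names are placeholders), the active and deleted record counts can be checked, and a file reorganized, with:

        DSPFD FILE(MYLIB/ORDHIST) TYPE(*MBR)   /* shows current and deleted record counts */
        RGZPFM FILE(MYLIB/ORDHIST)             /* removes the deleted records             */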
  • WoodEngineer
    Double check your join. It is possible that one record from your primary file is being joined with multiple records from the secondary file.
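    One way to check for that, assuming purely for illustration that the files are joined on a field named ORDNBR and the secondary file is MYLIB.ORDDTL, is an SQL query such as:

        SELECT ORDNBR, COUNT(*)
          FROM MYLIB.ORDDTL
         GROUP BY ORDNBR
        HAVING COUNT(*) > 1

    Any ORDNBR returned matches more than one secondary record, which multiplies the rows the join produces.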
  • ToddN2000
    As WoodEngineer mentioned, check the join. You may need to fine-tune the join conditions and add an additional field to avoid picking up more than one matching record. The other thing to check, as mentioned above, is deleted records: look at all of the files used in the join, and you may need to run RGZPFM on all the files involved.
  • azohawk
    Check the index advisor (in Navigator for i) to see if there are some recommended indexes to create on one or more of these files.
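    If Navigator is not handy, the same advice is also exposed through the QSYS2.SYSIXADV catalog view and can be queried with SQL, for example (the library name is a placeholder):

        SELECT *
          FROM QSYS2.SYSIXADV
         WHERE TABLE_SCHEMA = 'MYLIB'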
  • pdraebel
    Reorganizing the based-on files of the access path(s) used could also prove beneficial, since the main advantage of input primary is reading blocks of data rather than record by record.
  • WoodEngineer
    It would be helpful to know how many active and deleted records are in the primary file. While deleted records will increase processing time, it would take a lot of them to have a significant impact on the job.
    How long did the job run a few months ago compared to today?
  • ToddN2000
    Another possibility: if this job is a resource hog, it may be getting changed by operations. Sometimes they will change the run priority on the fly so other jobs can finish. It may also be that the CL was changed to submit the job at a different priority or with a new JOBD.
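    While the report job is running, its current run priority can be checked and, if needed, put back with CL commands like these (the job identifiers are placeholders):

        DSPJOB JOB(123456/QUSER/RPTJOB) OPTION(*RUNA)   /* shows the current run priority    */
        CHGJOB JOB(123456/QUSER/RPTJOB) RUNPTY(20)      /* restore the expected run priority */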
