Data Conversion on IBM AS/400

Hi,

We're converting some old master files into new-format files. These master files have around 58 million records each. We wrote COBOL programs to convert the format and write to the new files. To speed up the process, we are splitting each master file into 20 parts (i.e., 20 files, spreading the records so that records 1 to 20000 go to FILE1, records 20001 to 40000 to FILE2, etc.) and running the same program in multiple jobs, after the necessary overrides, to populate the records in the new format in NEWFILE. It is taking about 5 hours to complete all these jobs, even while using 85% of the CPU. I would like to speed the process up further. Can someone help or guide me on how I can improve the performance? What is the best approach for dealing with such conversions? We need to convert 4 such files. Any help is appreciated.

Thanks,
Vasanth
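
For reference, a minimal CL sketch of how one such slice job might be set up, with all names hypothetical (CONVLIB, CONVPGM, the slice files FILE1/NEWFILE1, and the MASTER/NEWFILE names the COBOL program opens):

    PGM  /* CONVCL01: drives slice 1; one such CL program per slice */
      /* Point the program's input and output at this slice's files */
      OVRDBF FILE(MASTER) TOFILE(CONVLIB/FILE1)
      OVRDBF FILE(NEWFILE) TOFILE(CONVLIB/NEWFILE1)
      CALL PGM(CONVLIB/CONVPGM)
      DLTOVR FILE(*ALL)
    ENDPGM

    /* Submit each slice as its own batch job */
    SBMJOB CMD(CALL PGM(CONVLIB/CONVCL01)) JOB(CONV01)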

Answer Wiki


Is the physical file keyed? Are there any logical files built over the new physical? If so, you should consider temporarily removing the logicals. If the application isn't using keyed access on the physical as it runs, it should run much faster.
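
A hedged sketch of that approach, assuming a hypothetical logical file NEWFILEL1 in library CONVLIB built over the new physical:

    /* Drop the logical file member before the conversion runs */
    RMVM FILE(CONVLIB/NEWFILEL1) MBR(NEWFILEL1)

    /* ... run the conversion jobs ... */

    /* Rebuild the access path afterward in one pass */
    ADDLFM FILE(CONVLIB/NEWFILEL1) MBR(NEWFILEL1)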

Discuss This Question: 9 Replies

 
  • Solutions1
    Have you looked at what the pacing resources are? Is the conversion more I/O-bound or compute-bound? If I/O-bound, can you address caching issues, get more spindles, etc.? If compute-bound, are there particular conversion steps that are slow and could be optimized - e.g., are you doing string operations where a COBOL REDEFINES might do? If you sort the files, can you reuse data from the preceding record - e.g., if a given product or customer shows up again and again? Is it necessary (and advantageous) to use the AS/400 - e.g., could you move the conversion to PCs in some approximation of "grid" computing?
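    One hedged way to answer the I/O-bound vs. compute-bound question is to watch the standard work displays while the conversion jobs run:

        WRKACTJOB   /* CPU percentage used by each conversion job */
        WRKDSKSTS   /* disk arm utilization; sustained high %busy suggests I/O-bound */
        WRKSYSSTS   /* page faults and storage pool activity */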
  • Davega
    What blocking factor have you specified on the read and the write? The blocking factor can affect the run time; you want to block with the largest block size possible on both the read and the write. If you have logical files over the output file, remove the member and add it back after processing is complete. Ensure the target file is not journaled. Cheers, Dave
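    A minimal CL sketch of Dave's points, assuming a hypothetical target NEWFILE in library CONVLIB that is currently journaled to JRNLIB/CONVJRN:

        /* Stop journaling the target for the duration of the load */
        ENDJRNPF FILE(CONVLIB/NEWFILE)
        /* Block reads and writes; SEQONLY buffers many records per I/O */
        OVRDBF FILE(MASTER) SEQONLY(*YES 32767)
        OVRDBF FILE(NEWFILE) SEQONLY(*YES 32767)
        /* ... run the conversion ... */
        /* Resume journaling afterward */
        STRJRNPF FILE(CONVLIB/NEWFILE) JRN(JRNLIB/CONVJRN)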
  • RolandT
    Might be obvious, but make sure that the output files are not being journaled, remove trigger programs from the output files, remove logical file members on all logicals over the output files, and submit the job to batch. Also increase the time slice (the default is 5000) and make sure that PURGE is set to *NO.
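    The job-level pieces of that advice might look like this in CL (values hypothetical; RMVPFTRG is assumed here to default to removing all triggers):

        /* Remove trigger programs from the output file */
        RMVPFTRG FILE(CONVLIB/NEWFILE)
        /* Give the current job a larger time slice and keep it resident */
        CHGJOB TIMESLICE(50000) PURGE(*NO)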
  • Jspera2525
    You can speed up the record-reading side of the process by not reading by key.
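    If the program reads sequentially but happens to open a keyed logical, one hedged option is to override it to the arrival-sequence physical (names hypothetical):

        OVRDBF FILE(MASTERL1) TOFILE(CONVLIB/MASTERPF)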
  • Inoboa
    Remove all keys from the new files, but be sure that the input files are sorted in the order of the first key (to avoid page splitting in memory when you later build the key). Have you tried increasing the I/O buffers for this operation only? It can help you.
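    One way to feed the program in key order without permanently keying the input is OPNQRYF, sketched here with a hypothetical key field CUSTNO (note the shared open it requires):

        OVRDBF FILE(MASTER) SHARE(*YES)
        OPNQRYF FILE((CONVLIB/MASTER)) KEYFLD((CUSTNO))
        CALL PGM(CONVLIB/CONVPGM)
        CLOF OPNID(MASTER)
        DLTOVR FILE(MASTER)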
  • Ddomic
    Before running your programs, consider performing

        STRASPBAL ASP(*ALL) TYPE(*CAPACITY) TIMLMT(*NOMAX)
        STRDSKRGZ TIMLMT(*NOMAX) ASP(*ALL)

    and wait until they end. Then run your programs. If you have only a few disk arms, do not run all of your programs concurrently!
  • JohnBrandt (EXPERT)
    The first thing to consider is the destination files. They should not be keyed, nor have any triggers. If possible, don't have logicals over them, although this will increase the time required to build the logicals afterward. The second thing to consider is overriding the files to use record blocking. The third is to read and write without keys. A typical command to do the override would look like this:

        OVRDBF FILE(A) TOFILE(B) NBRRCDS(32767) SECURE(*YES)

    Then call your COBOL program. This will also be posted in the Ask The Experts area of Search400.com.
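    Putting that override together around the program call (program and library names hypothetical; SECURE(*YES) shields the file from overrides made at earlier call levels):

        OVRDBF FILE(A) TOFILE(B) NBRRCDS(32767) SECURE(*YES)
        CALL PGM(CONVLIB/CONVPGM)
        DLTOVR FILE(*ALL)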
  • Jspera2525
    Check this out: if you have millions of records chaining out to some master file, like a customer master, to get current data, you can speed the process up by loading the customer master file into a user space. Getting information from the user space is much, much quicker than a chain to a file. However, accessing the user space will be a little tricky. You can use the customer number as the offset into the user space to get the data, but if the master file is keyed by a product number with alpha characters, you may want to create a small index file to store the user-space offset for a given product.
  • JohnBrandt (EXPERT)
    A user space is limited to 16 MB, and with teraspace it's a nightmare. Even with a user space, you still have to get the data into it, which is why an HLL is the only real answer.
