Multiple submission of a JOB

Tags:
AS/400
AS/400 jobs
CL Program
RPG Program
Hi everyone,

Requirement 1.a: I have a CL program (ISC) that reads and processes 10 million records from an input file IF, applying some conditions, and writes to an output file OF1 with the help of an RPG program R1. My requirement is to improve the performance of this read process by submitting multiple jobs that run the same CL program. I need to submit 10 similar jobs of the same CL with names like ISC001, ISC002, ... ISC010. The first job, ISC001, will read records 1 to 1,000,000 from the input file; the second job, ISC002, will read records 1,000,001 to 2,000,000; the third job will read records 2,000,001 to 3,000,000; and so on. Also, instead of writing to the old output file OF1, the jobs should now write to another output file, OF2. Please note that the structures of OF1 and OF2 are the same except that OF2 has an additional field, Nway-number, which I need to populate based on which submitted job wrote the record. How should I approach this so that I get the data into the new output file OF2 along with the Nway number? Can anyone explain the process of submitting the same job (CL) multiple times with different names like ISC001, ISC002, ... ISC010, and also how to restrict the data range handled by each job?

Requirement 1.b: I was told that the old output file contains no duplicates on the Product ID field (its key fields are Product ID and Currency code), but that we will get duplicates now because of the Nway processing; in the new file the key fields are Product ID, Currency code, and Nway-no. So I also need a new RPG program R2 that simply reads the new output file OF2 and populates the old output file OF1. Can anyone please share your thoughts and suggestions so that I can proceed? Awaiting your reply. Thanks!

Software/Hardware used:
AS/400, RPG/400, CLLE
ASKED: August 26, 2013  1:29 AM
UPDATED: August 27, 2013  1:17 PM

Answer Wiki


There are some things to consider when doing this.

1. Is the input file keyed? If it is, are you reading it by key? If so, you need to come up with some way of breaking the file into your 10 segments; it would be nearly impossible to get 10 equal segments.

2. I would also use the same output file structure for all 10 output files. For the first job you put 01 in the Nway number field.

3. You set up your CL to accept an input parameter, then call it with a blank parameter. A blank tells the program it is the initial call.

4. In the CL, you use an override (OVRDBF) with the POSITION parameter to position the input file at the correct relative record number (RRN).

5. You override the output file to the correct output file by concatenating the input parameter onto a variable to build the file name.

6. You set a counter in the CL so you can loop and do 10 SBMJOBs in the program. I would use the variable you created for the output file name as the job name in the SBMJOB.

7. Your RPG program has a counter so it processes only 1 million records and then goes to EOJ. NOTE: IF YOUR FILE HAS DELETED RECORDS, you will need to use the INFDS to determine end-of-segment by checking the RRN, since the record count alone will drift.
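Pulling steps 3 through 6 together, a minimal CLLE sketch of a self-submitting driver might look like the following. The names ISC, R1, and IF come from the question; the 2-digit Nway parameter, the job-name pattern, and the 1,000,000-record slice size are assumptions. Here the Nway number is passed to R1 to write into OF2; if you instead use 10 separate output files as in step 5, add an OVRDBF for the output file as well.

```
PGM        PARM(&NWAY)
DCL        VAR(&NWAY)    TYPE(*CHAR) LEN(2)
DCL        VAR(&NBR)     TYPE(*DEC)  LEN(2 0) VALUE(0)
DCL        VAR(&STRRRN)  TYPE(*DEC)  LEN(10 0)
DCL        VAR(&JOBNAME) TYPE(*CHAR) LEN(10)

/* Blank parm = initial call: loop and submit the 10 jobs   */
IF         COND(&NWAY *EQ '  ') THEN(DO)
   DOWHILE    COND(&NBR *LT 10)
      CHGVAR     VAR(&NBR) VALUE(&NBR + 1)
      CHGVAR     VAR(&NWAY) VALUE(&NBR)        /* '01'..'10' */
      CHGVAR     VAR(&JOBNAME) VALUE('ISC0' *CAT &NWAY)
      SBMJOB     CMD(CALL PGM(ISC) PARM(&NWAY)) JOB(&JOBNAME)
   ENDDO
   RETURN
ENDDO

/* Submitted copy: position the input file, then call R1.   */
/* R1 reads &NWAY, writes it to OF2, and stops after 1M.    */
CHGVAR     VAR(&NBR) VALUE(&NWAY)
CHGVAR     VAR(&STRRRN) VALUE(((&NBR - 1) * 1000000) + 1)
OVRDBF     FILE(IF) POSITION(*RRN &STRRRN)
CALL       PGM(R1) PARM(&NWAY)
DLTOVR     FILE(*ALL)
ENDPGM
```

Because the submitted jobs call the same program with a non-blank parameter, they skip the submission loop and go straight to the override-and-call logic.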

=====

Now, all that said, this may not reduce the overall run time by much. Not knowing how your RPG program is set up or what it is actually doing, we cannot give you advice on that. Record blocking on your files or using SQL might help you more.
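On the SQL point: if the per-record conditions can be expressed in a WHERE clause, one set-based statement per slice may outperform a record-at-a-time loop entirely. A rough sketch using the RUNSQL CL command, assuming OF2's columns are IF's columns plus the Nway number; the literal 1 stands in for the slice number that job ISC001 would use:

```
/* Each submitted job runs one INSERT over its own RRN range.  */
/* RRN() is the DB2 for i relative-record-number function.     */
RUNSQL     SQL('INSERT INTO OF2 +
                SELECT i.*, 1 FROM IF i +
                WHERE RRN(i) BETWEEN 1 AND 1000000') +
           COMMIT(*NONE)
```

In each submitted job you would substitute that job's Nway number and RRN range into the statement string before running it.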

Discuss This Question: 1  Reply

 
  • TomLiotta

    I agree particularly with the last paragraph of the 'Answer'. Before making any changes, it should be determined if 10 jobs processing a million rows each will be any faster than one job processing 10 million records. It's certainly possible that they'll be faster if nothing else is going on; but they might actually take longer. They will definitely do more work, so taking longer is possible.

    They have to do 20 file OPENs instead of only two, for example. They each need to do a search to find a starting position in input file IF, instead of simply starting at the beginning of the file. There will be potential contention for output file access. And each one will now need to test the current record number after every READ in order to tell whether it's time to end.

    The actual problem needs to be defined before going to multiple jobs. The first thing I'd look at is how the SEQONLY() attribute is set for both the IF and OF files. Other elements, such as the memory pool available to the job, should also be handled according to whatever the requirements are.
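    An easy first experiment along those lines is to force large-block sequential I/O with overrides before changing any code. This is only a sketch; the buffer size of 1000 records is illustrative, and the overrides propagate to the called programs under the default override scope:

    ```
    OVRDBF     FILE(IF)  SEQONLY(*YES 1000)
    OVRDBF     FILE(OF1) SEQONLY(*YES 1000)
    CALL       PGM(ISC)
    DLTOVR     FILE(*ALL)
    ```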

    Tom

