AS/400 INFDS

Tags: AS/400, INFDS, iSeries
I want to use the INFDS to avoid writing duplicate records into a PF. What are the requirements for that?

Software/Hardware used:
iSeries

Answer Wiki


The easiest way to eliminate duplicate records is to put a UNIQUE key on the file.
Is there some reason you cannot do that?
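
For illustration, a keyed physical file gets that behaviour from the UNIQUE file-level keyword in its DDS. The names below (CUSTPF, CUSTREC, CUSTNO, CUSTNAME) are invented for the sketch; only the UNIQUE keyword and the K (key) entry matter. DDS is positional (A in column 6, names from column 19, keywords from column 45), so treat this as a sketch rather than a copy-paste source:

     A                                      UNIQUE
     A          R CUSTREC
     A            CUSTNO         7P 0
     A            CUSTNAME      30A
     A          K CUSTNO

Changing an existing file this way means recreating it (CRTPF) and reloading the data, which will run into trouble if duplicate keys are already present - a point the discussion below comes back to.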

Discuss This Question: 7 Replies

 
  • The most-watched IT questions: April 2, 2012 - ITKE Community Blog
    [...] 7. Help Nikraipapa find a solution to finding the requirements for eliminating duplicate records. [...]
  • anandx
    Case #1: the file-level keyword is UNIQUE.

    WRITE the record.
    If the exception/error code is 01021 / %error (record already exists),
    skip it - the write was rejected.

    Case #2: the file-level keyword is NOT UNIQUE.

    Option #1:
    CHAIN the record.
    If the exception/error code is 00012 / %found (no rec found),
       then write the record;
    else don't write it.

    Option #2:
    Go ahead and write the record.
    If the exception/error code is 00000 / not %error (no exception/error occurred),
    we are lucky and everything is fine;
    else (%error) don't write.

    (A free-format sketch of both cases follows this reply.)

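Purely as illustration, here is a minimal free-format RPG sketch of the two cases above, using invented names (file CUSTPF, record format CUSTREC, key field CUSTNO, matching the DDS sketch earlier). The status value tested after the WRITE is the same code that appears in the *STATUS subfield of the file's INFDS, which is what the original question asks about; %status and %error are simply a shorter way to get at it:

**free
dcl-f custpf usage(*input:*output) keyed;   // assumed externally described, keyed file
dcl-s newCustNo packed(7:0);                // key value we want to add

// Case #1: there is a unique key over the data (UNIQUE on the PF or on a logical over it).
// Write with the (E) extender and inspect the status afterwards;
// 01021 = "tried to write a record that already exists".
custno = newCustNo;
write(e) custrec;
if %error and %status(custpf) = 1021;       // status 01021: duplicate key
   // the write was rejected by the unique key, so nothing was added
endif;

// Case #2, option #1: no unique key, so the write would always succeed.
// Check for the key first; a not-found CHAIN sets status 00012 and
// leaves %found(custpf) off (it is not treated as an error).
chain newCustNo custpf;
if not %found(custpf);
   custno = newCustNo;
   write custrec;
endif;

*inlr = *on;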
  • anandx
    Correction to case #2, option #1:
    CHAIN the record;
    if the exception/error code is 00012 / NOT %found (no rec found)
  • Splat
    It would be simple enough to use an index, either on the physical file (not something I recommend) or through a logical file, to determine the existence of duplicates.
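For example (again with invented names, matching the sketches above), a UNIQUE keyed logical file over the existing physical file could be defined in DDS like this; sharing the record format CUSTREC of CUSTPF means no field list is needed:

     A                                      UNIQUE
     A          R CUSTREC                   PFILE(CUSTPF)
     A          K CUSTNO

Programs that must reject duplicates would then write through this logical. As noted in the replies further down, CRTLF will fail if duplicate keys are already present in the data.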
  • anandx
    I agree that indexing is the best solution. However, Nikrajpapa, the author of this question, doesn't say whether this is a new file, an existing file already used by an application, or merely an academic question. Regardless, I was thinking of the case where the file is already in circulation, quite a few programs that allow duplicate keys have been written against it and are in use, and now a one-time program (to be discarded after a single run) has to write only unique records to this otherwise non-unique file. Should we change the existing file, and with it every existing program that accesses it? Create a new LF with a unique key temporarily, use it, and delete it after the one-time run? Or leave the existing files and programs alone, write a one-time program that avoids duplicate keys itself, and be done with it immediately after?
  • GregManzo
    Either changing the file to UNIQUE or adding a new UNIQUE logical will be "problematic" if other programs try to write duplicate records, because they will start crashing. Also, if there are pre-existing duplicate keys, the attempt to create a UNIQUE logical will fail anyway (we can trust the OS on this). And if you can't add a UNIQUE key, the write-and-trap-error approach won't help either, because the write will never fail.
    Unique key or no, I would recommend using a CHAIN or SETLL to test for the key before attempting the write.
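A sketch of that suggestion with the same invented names (CUSTPF keyed on CUSTNO): SETLL positions to the key without reading or locking a record, and %equal comes on only when a record with exactly that key exists, so it is a cheap existence test before the WRITE.

**free
dcl-f custpf usage(*input:*output) keyed;   // assumed externally described, keyed file
dcl-s newCustNo packed(7:0);

setll newCustNo custpf;        // position to the key; no record is read
if not %equal(custpf);         // no exact key match, so it is safe to write
   custno = newCustNo;
   write custrec;
endif;

*inlr = *on;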
  • anandx
    Agreed. I should have thought twice before committing; I realized the mistake and rolled back. Sorry for the rush, and thanks for pointing it out.
