You have 2 separate parts of this question.
1) you need to run DSPPGM over the nominated object(s) remotely and collect the data
2) you need to (automatically) repatriate these results
3) you need to report on the ‘latest’
Ooh – that’s 3. Spanish Inquisition time (an abstruse reference to the UK television programme ‘Monty Python’s Flying Circus’) – hey, it’s Friday.
Anyway, first write some CLP to accept one or more program names and run DSPPGM-style detail to an *OUTFILE. Ensure these work the way you need them to.
Depending on how you configure the remote boxes, the program object’s creation date may or may not be enough. I would play safe and follow your guidelines in establishing the file levels used within the code, so that you can create a unique ‘footprint’ for each object.
This will result in a file. Arrange for it to carry a date/timestamp somewhere so that it can itself be uniquely identified. You will, of course, keep the system ID within it.
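A minimal CLP sketch of step 1. One caveat: DSPPGM itself has no OUTPUT(*OUTFILE) parameter, so the sketch uses DSPOBJD, which does, and which writes creation date/time and similar detail you can build a footprint from (DSPPGMREF is another outfile-capable option). The names MONLIB and PGMSNAP are invented – substitute your own standards.

```
/* CAPTPGM: capture object detail for one program into an outfile. */
/* MONLIB/PGMSNAP are assumed names - adjust to your shop.         */
PGM        PARM(&PGMNAME &PGMLIB)
  DCL      VAR(&PGMNAME) TYPE(*CHAR) LEN(10)
  DCL      VAR(&PGMLIB)  TYPE(*CHAR) LEN(10)

  /* DSPPGM has no *OUTFILE option, so use DSPOBJD, which writes   */
  /* creation date/time, owner, etc. to a database file.           */
  DSPOBJD  OBJ(&PGMLIB/&PGMNAME) OBJTYPE(*PGM) +
           OUTPUT(*OUTFILE) OUTFILE(MONLIB/PGMSNAP) +
           OUTMBR(*FIRST *ADD)
ENDPGM
```

You can add the system ID to each run by retrieving it with RTVNETA SYSNAME(&SYS) and writing it alongside the captured rows.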
2) Write a bit more CLP to save this file and FTP it to the home system. You must arrange that the files have non-colliding IDs when they arrive. Pay attention to FTP security – perhaps use a dedicated user ID that only has authority to the landing library/folder.
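A sketch of step 2, using the usual OVRDBF FILE(INPUT) technique to drive the FTP client from a script member. All names here (MONLIB, PGMSNAP, HOMEBOX, the QFTPSRC members) are assumptions, not anything IBM-supplied.

```
/* SNDSNAP: save the snapshot file and FTP it to the home box.     */
PGM
  DCL        VAR(&SYSNAME) TYPE(*CHAR) LEN(8)
  RTVNETA    SYSNAME(&SYSNAME)

  /* Package the outfile into a save file for transfer.            */
  CRTSAVF    FILE(QTEMP/SNAPSAVF)
  MONMSG     MSGID(CPF0000)   /* ignore 'already exists'           */
  CLRSAVF    FILE(QTEMP/SNAPSAVF)
  SAVOBJ     OBJ(PGMSNAP) LIB(MONLIB) DEV(*SAVF) +
             SAVF(QTEMP/SNAPSAVF)

  /* The FTP commands live in a source member, e.g.:               */
  /*   USER XFERUSER xxxxxxx                                       */
  /*   BINARY                                                      */
  /*   PUT QTEMP/SNAPSAVF LANDLIB/SNAPSAVF                         */
  /*   QUIT                                                        */
  /* Build a per-system target name into each box's member so the  */
  /* arriving files do not collide on the home system.             */
  OVRDBF     FILE(INPUT)  TOFILE(MONLIB/QFTPSRC) MBR(FTPCMDS)
  OVRDBF     FILE(OUTPUT) TOFILE(MONLIB/QFTPSRC) MBR(FTPLOG)
  FTP        RMTSYS(HOMEBOX)
  DLTOVR     FILE(INPUT OUTPUT)
ENDPGM
```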
You could use scheduled jobs to run the above, or place a wrapper over them and call it as needed, or telnet to each box – however you want to collect the information.
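If you go the scheduled-job route, a single ADDJOBSCDE entry per box is enough. The job name, program, and parameters below are invented – a sketch only.

```
/* Run the collector every night at 23:30; names are assumptions.  */
ADDJOBSCDE JOB(PGMAUDIT) +
           CMD(CALL PGM(MONLIB/CAPTPGM) PARM('MYPGM' 'MYLIB')) +
           FRQ(*WEEKLY) SCDDAY(*ALL) SCDTIME(2330)
```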
3) You now have all the data on the home box.
Heck, even I could build and run a query, or SQL, or whatever, to report over the content of the several files – joined by object name, and reporting on system ID and footprint.
Include your development box/library in the reporting so that your report can compare against the ‘latest’.
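The join described above might look like this in SQL, assuming all the landed rows end up in one combined file. The table and column names (PGMSNAP, SYS_ID, OBJ_NAME, FOOTPRINT, 'DEVBOX') are invented for illustration, not IBM outfile field names.

```sql
-- Compare each remote box's footprint against the development
-- ('latest') copy of the same program.
SELECT r.sys_id,
       r.obj_name,
       r.footprint AS remote_fp,
       d.footprint AS latest_fp,
       CASE WHEN r.footprint = d.footprint
            THEN 'CURRENT' ELSE 'STALE' END AS status
FROM   pgmsnap r
JOIN   pgmsnap d
  ON   d.obj_name = r.obj_name
 AND   d.sys_id   = 'DEVBOX'
WHERE  r.sys_id  <> 'DEVBOX'
ORDER BY r.obj_name, r.sys_id
```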