Conversion Project

Tags:
Application development
AS/400
Data analysis
DB2 Universal Database
We are currently working on a project that will, at a later date, require a lot of data conversion on our existing system to fit the new file structures currently being developed. The project is being developed in COBOL on an iSeries, and the old and new files will all be physical files. I was hoping someone out there could give me a few pointers on how to approach a conversion project. I am aware that all conversion projects are different, but was hoping there might be a book or website that someone has used before and found useful and helpful. Most of the staff who will be working on this come from a 'development' background. Without going into too much detail, we have had small conversion projects where we used a COBOL program to 'convert' the data. Perhaps there are better tools out there that someone could point me towards. Thanks in advance, B

Answer Wiki


As both your source and target data reside on the iSeries, you shouldn't have too much trouble.

We have experience of replicating data between the iSeries and MS-SQL using a product called Scribe Insight.
It allows you to map source files and fields to target files and fields. The mapping process also lets you apply functions to source data to ensure it is in the correct format for the target file. You can also perform lookups on other files where you are changing data records, or create new records from a mixture of old ones.
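Scribe Insight's mapping is configured in its own designer, but the underlying idea -- map source fields to target fields, applying transform functions and lookups along the way -- can be sketched in a few lines of Python. All field names and the lookup table here are hypothetical, for illustration only:

```python
# Hypothetical lookup against another file (e.g. a region code table).
LOOKUP_REGION = {"01": "NORTH", "02": "SOUTH"}

# Mapping: target field -> (source field, transform function).
FIELD_MAP = {
    "CUSTNAME": ("CSTNAM", str.strip),
    "REGION":   ("RGNCDE", lambda v: LOOKUP_REGION.get(v, "UNKNOWN")),
    "BALANCE":  ("BALDUE", lambda v: round(float(v), 2)),
}

def convert_record(source: dict) -> dict:
    """Build one target record from one source record via the map."""
    return {tgt: fn(source[src]) for tgt, (src, fn) in FIELD_MAP.items()}

record = {"CSTNAM": "  ACME LTD ", "RGNCDE": "02", "BALDUE": "123.456"}
print(convert_record(record))
# {'CUSTNAME': 'ACME LTD', 'REGION': 'SOUTH', 'BALANCE': 123.46}
```

The point of the pattern is that new fields or rules only touch the map, never the conversion loop itself.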

Discuss This Question: 4  Replies

  • Bvwatson
    I don't know how far along you are, but I've found it *crucial* to complete some preliminary exercises before building any data conversion code (whether you buy or build):
    (1) Build a conceptual data model defining the data stored in the source and target files. Look for any source data that does not have a target -- this data will be discarded, so make sure that's appropriate; if not, make a place for it in the target file. Look for any target data that does not have a source -- this data must be derived or constructed from somewhere.
    (2) Examine ALL of the source/legacy data, looking for anything unexpected. Don't depend on the documentation; don't depend on the legacy programmers. Write (or run, if you buy a tool) as many validations as you can think of, against every field in every file. Make sure that numbers don't contain non-numerics, that dates are in the correct format, that leading spaces are found, that encoded fields have valid codes, etc. Also make sure that the data relationships between any two files are all valid.
    (3) As an outcome of the validation step, build an inventory of the data cleansing and conversion algorithms you'll need. For example, if some source character fields contain leading spaces (and you want these left-justified in the target), make a note that you'll need a "left trim" function. If numerics are "free-form" (optional decimal points and signs, leading/trailing spaces) and you want real numerics in the target, you'll need a "number alignment" function. Check all other oddities in the source the same way.
    (4) Decide what you will do with invalid or unconverted data and make sure that's okay with the business. One (easy) option is to throw it away. Another is to capture bad data in a "pending" file that can be corrected and fed back into the same conversion. Make sure that your conversion tool/code can do incremental conversion; otherwise, one piece of bad data can make you start the whole file/process over again.
    (5) Decide how you will verify the conversion. Can you compare the old and new data? Maybe take a hash/sum/CRC of relevant data in the old and new files and match them up. Make sure you have a strategy for knowing that the conversion worked properly.
    Having gone through these preliminaries, you'll be ready to either evaluate a tool or design and build your own functions for data cleansing and conversion. There are no limitations to doing this in COBOL (i.e., you don't need special C or Java facilities), so that's not an obstacle at all. Data conversion is a pretty thankless job, so the more care you take setting it up, the less time you'll spend hacking your way around "surprises". Good luck.
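The validation step above can be sketched quickly in any language (COBOL included). This hypothetical Python version shows the shape of it -- field names and rules are invented for illustration, and the real rules would come from your own data model:

```python
import re

def validate_record(rec: dict) -> list:
    """Return a list of problems found in one source record."""
    problems = []
    # Numbers must not contain non-numerics (after trimming spaces).
    if not re.fullmatch(r"-?\d+(\.\d+)?", rec["AMOUNT"].strip()):
        problems.append("AMOUNT not numeric: %r" % rec["AMOUNT"])
    # Dates must be in the expected format (here: YYYYMMDD).
    if not re.fullmatch(r"\d{8}", rec["ORDDATE"]):
        problems.append("ORDDATE bad format: %r" % rec["ORDDATE"])
    # Flag leading spaces, so a "left trim" goes into the inventory.
    if rec["NAME"] != rec["NAME"].lstrip():
        problems.append("NAME has leading spaces (need left trim)")
    # Encoded fields must carry valid codes.
    if rec["STATUS"] not in {"A", "I", "P"}:
        problems.append("STATUS invalid code: %r" % rec["STATUS"])
    return problems

bad = {"AMOUNT": "12..5", "ORDDATE": "2024-01-01", "NAME": "  SMITH", "STATUS": "X"}
print(validate_record(bad))   # four problems reported
good = {"AMOUNT": " 12.50", "ORDDATE": "20240101", "NAME": "SMITH", "STATUS": "A"}
print(validate_record(good))  # []
```

Running something like this over every record of every file, and tallying the problems, gives you both the cleansing inventory for step (3) and the "pending" feed for step (4).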
  • Solutions1
    Is this a "big bang" one-time conversion, or are there phases? Especially if the latter, you might look at ETL (extract, transform, load) software. Also, data conversions are risky and productivity-impacting for the organization as a whole (and perhaps for customers, suppliers, etc.). Therefore, publish a dictionary/synonym results table reflecting row-by-row changes to the actual data, so that people outside the project can see what is happening to the database and, if needed, do conversions in their own environments. Make the feedback table accessible via search.
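A minimal sketch of such a feedback table, assuming nothing about your schema: log every row-level change the conversion makes into a searchable store. SQLite stands in here for whatever store you would actually use (a DB2 file, for instance), and all file/field names are hypothetical:

```python
import sqlite3  # stands in for any searchable store (a DB2 file, etc.)

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE conv_log
               (src_file TEXT, rec_key TEXT, field TEXT,
                old_val TEXT, new_val TEXT)""")

def log_change(src_file, rec_key, field, old_val, new_val):
    """Record one row-level change made by the conversion."""
    con.execute("INSERT INTO conv_log VALUES (?,?,?,?,?)",
                (src_file, rec_key, field, old_val, new_val))

# During conversion, every transformation is logged:
log_change("CUSTMAST", "C00042", "REGION", "02", "SOUTH")
log_change("CUSTMAST", "C00042", "BALANCE", " 123.456", "123.46")

# Anyone outside the project can then search what happened to a record:
rows = con.execute("SELECT field, old_val, new_val FROM conv_log "
                   "WHERE rec_key = 'C00042'").fetchall()
print(rows)  # [('REGION', '02', 'SOUTH'), ('BALANCE', ' 123.456', '123.46')]
```

The same table doubles as an audit trail for the verification step, since it ties every target value back to its source.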
  • Growler63
    If the data is integral to your enterprise, then you might think about starting something a little more generic and extensible. I have written three 'one-shot' conversion applications that are still in use (one has been in operation for nine years now). I have modified, refactored and redesigned it several times. So much for 'one-shot'. All of my conversion utilities now follow a common design with three pluggable modules: input parser, output formatter and transformer. The original input parser was written in C++, then Python, and now Java with ANTLR. The transformer is a rule-based template engine and the output is a common templating system. If you are starting today from scratch, XSLT is a viable solution. The input does not have to be XML; you just write the processing code in a SAX-compliant manner and feed that to Saxon or Xalan. XSLT is an exceptionally powerful and productive transformation system, although it does take a little getting used to. The one thing I would strongly advise against is trying to do the conversion in one monolithic program. It will be harder to implement, understand and modify even while you are developing it and it is fresh in your mind. The Parse-Transform-Serialize model is appropriate for anything more complex than a one-line Perl regex transformation.
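The pluggable Parse-Transform-Serialize design described above can be sketched like this. Each stage is a swappable callable, and the pipeline itself never changes; the record layout and names are hypothetical:

```python
def parse_fixed_width(line: str) -> dict:
    """Input parser: fixed-width legacy record -> dict (columns assumed)."""
    return {"name": line[0:10].strip(), "amount": line[10:18].strip()}

def transform(rec: dict) -> dict:
    """Transformer: apply the conversion rules."""
    return {"name": rec["name"].title(),
            "amount": "%.2f" % float(rec["amount"])}

def serialize_csv(rec: dict) -> str:
    """Output formatter: dict -> target format."""
    return "%s,%s" % (rec["name"], rec["amount"])

def pipeline(lines, parser, transformer, serializer):
    """The fixed pipeline: only the three plug-in stages ever change."""
    return [serializer(transformer(parser(ln))) for ln in lines]

legacy = ["SMITH     00012.50"]
print(pipeline(legacy, parse_fixed_width, transform, serialize_csv))
# ['Smith,12.50']
```

Swapping the target from CSV to, say, a physical-file layout means writing one new serializer; the parser and transformer are untouched, which is what makes the 'one-shot' tool survive its second and third use.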
  • Pjhugconnectdataencoding111
    I'm interested in this work. Please send me the details at julitoortigas at live dot com
