Data Extraction control and checks

  • Good day all

    Could anyone advise on the best way to manage financial data extraction? The problems I face are as follows:

    1. Do a daily extract from a legacy mainframe system - works fine.

    2. Check account balances and whether a financial transaction has an earlier transaction date than its posting date - done.

    3. If there are transactions with an earlier transaction date, those records need to be extracted into an exception file - an ongoing headache! So far I have been able to set a status on these records (a sketch of that status update follows below), but I am struggling with the logic of creating the exception file, especially as the extractions for that transaction date need to be rerun.

    Any advice would be much appreciated
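
    For illustration, the status flag in step 3 looks something like this - a minimal sketch only, where the staging table and column names (ExtractStaging, TranDate, PostDate, Status) stand in for the real ones:

        -- Flag any staged transaction whose transaction date is earlier
        -- than its posting date as an exception for later extraction.
        UPDATE dbo.ExtractStaging
        SET    Status = 'EXCEPTION'
        WHERE  TranDate < PostDate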

    I'd build a DTS package to extract these. Is this SQL 2000? If so, the Dynamic Properties task can be used to easily find the date from a table or parameter and set that as the date used for the extract.
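
    For example, the Dynamic Properties task could read the extract date from a query along these lines (the control table and its columns here are just an assumption, not something from your system):

        -- Hypothetical control table holding the date to extract for.
        SELECT MAX(ExtractDate) AS ExtractDate
        FROM   dbo.ExtractControl
        WHERE  Completed = 0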

    Not trivial, but not too hard.

    Does this need to run on a schedule or on demand?

    Steve Jones

    sjones@sqlservercentral.com

    http://qa.sqlservercentral.com/columnists/sjones

    The server is SQL 2000 and the process needs to populate a daily extraction table, cleanse and check the data, copy the checked data to a monthly table, call an SP which will do the daily, weekly or monthly bcp extract, and finally copy the bcp out file to an FTP server. This will run on a daily schedule.

    I need to record the extraction, client, etc. and rerun any completed extractions for the dates and clients contained in the exception file (a sketch of one possible audit table follows).
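
    As a rough sketch only, with names invented for illustration, the audit of each run could be as simple as:

        -- Records each extraction run per client/date so completed runs
        -- can be identified and re-issued when exceptions turn up later.
        CREATE TABLE dbo.ExtractionLog (
            ExtractionID  int IDENTITY(1,1) PRIMARY KEY,
            ClientCode    varchar(20) NOT NULL,
            ExtractDate   datetime    NOT NULL,
            ExtractType   char(1)     NOT NULL,      -- D = daily, W = weekly, M = monthly
            Status        varchar(12) NOT NULL DEFAULT 'PENDING',  -- PENDING / COMPLETE / RERUN
            CompletedAt   datetime    NULL
        )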

  • Can anyone advise me on creating a DTS package as per Steve's reply?

    This is probably an easy one, except that I am seriously rusty due to having been a 'pure production DBA' for a tad too long!

    Any help would be much appreciated

    Best regards

    Charl

    Are you extracting to a staging table? That is, some permanent table that you use to hold the data prior to posting.

    Once it's in there, I'd use T-SQL to get the records into the state you want. The exception file, I would think, is an extract of all rows in a particular status. If that's the case, this ought to be an easy Transform Data task. You might change the name of the exception file using the Dynamic Properties task to match the current date/time.

    From there, I'd move these rows into another status - an 'exception output' status - and then try the rerun (a rough sketch of both steps follows).
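
    Something along these lines - purely a sketch, with the database, table, path and server names all made up for illustration:

        -- Build a date-stamped exception file name, bcp out every row in
        -- exception status, then move those rows to an 'exception output'
        -- status so they are not picked up twice.
        DECLARE @file varchar(200), @cmd varchar(500)

        SET @file = 'D:\extracts\exceptions_' +
                    CONVERT(varchar(8), GETDATE(), 112) + '.txt'   -- yyyymmdd

        SET @cmd = 'bcp "SELECT * FROM MyDB.dbo.ExtractStaging WHERE Status = ''EXCEPTION''" '
                 + 'queryout "' + @file + '" -c -T -S MYSERVER'

        EXEC master..xp_cmdshell @cmd

        UPDATE dbo.ExtractStaging
        SET    Status = 'EXCEPTION_OUT'
        WHERE  Status = 'EXCEPTION'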

    If you separate the extract from the host from the other steps, you could rerun the other steps until there are no new posts or exceptions.

    Steve Jones

    sjones@sqlservercentral.com

    http://qa.sqlservercentral.com/columnists/sjones

  • The data is pulled from 3 different DBs and placed into a staging table, which then has balance and exception checks run. There is a status column which I set for exception records (where the posting date is later than the transaction date).

    I'm using T-SQL to run the checks, but I need to create an audit trail for the normal versus exception extractions, which I will then use to re-issue any completed extractions for the affected client.

    Because each account has an opening and closing balance, the client requires a re-issue which includes the exceptions - and those can occur any time up to 3 months later!

    Would it be advisable to set up a batch number? One possible shape is sketched below.
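
    Purely as a sketch, with made-up names, I was thinking of stamping every staged row with the batch it was extracted under, so a re-issue for a client can pull back everything in the affected batches:

        -- One row per extraction batch; staging rows carry the BatchID
        -- (assumes a BatchID column added to the staging table).
        CREATE TABLE dbo.ExtractBatch (
            BatchID    int IDENTITY(1,1) PRIMARY KEY,
            ClientCode varchar(20) NOT NULL,
            RunDate    datetime    NOT NULL DEFAULT GETDATE(),
            BatchType  varchar(10) NOT NULL     -- NORMAL or REISSUE
        )

        -- Re-issue: pull every staged row for the batches affected by an
        -- exception (example client code and cut-off date shown).
        SELECT s.*
        FROM   dbo.ExtractStaging s
               JOIN dbo.ExtractBatch b ON b.BatchID = s.BatchID
        WHERE  b.ClientCode = 'ABC'
          AND  b.RunDate >= '20030101'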

    Thanks for the assistance Steve

  • On the subject of extraction: using a staging table is a sure thing. Then add views to pre-parse your data, including the exception file. Now use a trick: insert a row into a "trick table" and have its INSERT trigger fire a stored procedure for the final extractions (sketched below). On the subject of ETL tools, look up dbArtisan - it has a really nice job scheduler.
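
    A minimal sketch of that trigger arrangement, with all the names invented for illustration:

        -- Inserting a row into a small kick-off table fires a trigger,
        -- which in turn runs the final extraction procedure.
        CREATE TABLE dbo.ExtractKickoff (
            KickoffID   int IDENTITY(1,1) PRIMARY KEY,
            RequestedAt datetime NOT NULL DEFAULT GETDATE()
        )
        GO

        CREATE TRIGGER trg_ExtractKickoff ON dbo.ExtractKickoff
        FOR INSERT
        AS
            -- usp_RunFinalExtract is a hypothetical procedure doing the bcp work
            EXEC dbo.usp_RunFinalExtract
        GO

        -- Queue the final extraction by inserting one row.
        INSERT INTO dbo.ExtractKickoff DEFAULT VALUES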
