Hi,
I want to capture metadata for each batch job execution — such as the number of rows inserted, number of rows fetched, job start time, end time, job status, and exceptions — at both the dataflow level and the job level.
I want to know whether the process I am following is correct, or whether there is a better way to do it. I have attached a .doc file with annotated screenshots.
Description of process:
- To capture exceptions at the dataflow level, I have put the dataflow inside a Try-Catch block. In the catch block, I use a script to update a table that stores the exception details.
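For reference, a minimal sketch of what such a catch-block script might look like. The datastore name `DS_AUDIT` and table `JOB_ERROR_LOG` are assumptions for illustration; `error_number()`, `error_message()`, and `job_name()` are built-in Data Services functions available inside a catch block, and `{ }` substitutes a quoted value into the SQL string:

```
# Hedged sketch: assumes an audit datastore DS_AUDIT containing a table
# JOB_ERROR_LOG (JOB_NAME, ERROR_NUMBER, ERROR_MESSAGE, ERROR_TIME)
sql('DS_AUDIT',
    'INSERT INTO JOB_ERROR_LOG (JOB_NAME, ERROR_NUMBER, ERROR_MESSAGE, ERROR_TIME)
     VALUES ({job_name()}, {error_number()}, {error_message()}, SYSDATE)');
```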
- After the Query transform that maps source columns to the target, I use another Query transform to get the number of rows fetched from the source, by generating a row number and taking the max of it. I then update a table called Batch Control, which stores rows fetched and rows inserted (the latter via a simple calculation).
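A sketch of the update step, assuming the counts have already been captured into global variables (the variable names `$G_RowsFetched` and `$G_RowsInserted`, the datastore `DS_AUDIT`, and the exact `BATCH_CONTROL` columns are illustrative assumptions; `[ ]` substitutes an unquoted value into the SQL string):

```
# Hedged sketch: a script step after the dataflow, assuming
# $G_RowsFetched / $G_RowsInserted were populated earlier in the job
sql('DS_AUDIT',
    'UPDATE BATCH_CONTROL
        SET ROWS_FETCHED  = [$G_RowsFetched],
            ROWS_INSERTED = [$G_RowsInserted],
            END_TIME      = SYSDATE,
            JOB_STATUS    = {'SUCCESS'}
      WHERE JOB_NAME = {job_name()}');
```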
- To capture the number of rows rejected/discarded for the target table, I use a Case transform that checks whether the primary key columns are null; if so, I route those records to a table called Rejected Rows.
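The Case transform configuration for that split might look like the sketch below (the column names `CUST_ID` and `ORDER_ID` are placeholders for the actual primary key columns):

```
Case transform labels (illustrative):
  REJECTED : CUST_ID IS NULL OR ORDER_ID IS NULL   -> REJECTED_ROWS table
  VALID    : default (all remaining rows)          -> target table
```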
Please take a look at the screenshots in the attachments.
Question:
Is the process I am following correct/standard?
Please Guide.
Thanks.
lakhan (BOB member since 2011-05-25)