try/catch not working with data conversion error?

I have a BODS job that loads data from a csv file into an integer column in a DB2 table. Usually the csv data is reliably clean, so my BODS job does not pre-validate the csv file data. Instead, in order to be notified in the unlikely case that the input data is bad, I just put a try/catch around the workflow to send an email if any error occurs while loading the data. I tried testing that today by purposely giving it bad data, and was surprised to find that it isn’t catching the data conversion error. Does anybody know why that could happen? Are there some errors that aren’t catchable? In my catch declaration every exception type is checked. The data conversion error looks like this:
(14.0) 02-03-14 16:35:49 (E) (7868:3652) RUN-050802: |Data flow DRT_DF_LoadInputData|Reader DRTLAB_INPUT_FMT1
Cannot convert data <9-999-999> into type . Context: Column .
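
For context, the notify-on-error catch block described above is typically just a short script. A minimal sketch in Data Services script follows; smtp_to(), error_number(), and error_message() are built-in functions, but the recipient address, subject, and message wording here are placeholders, not taken from the original job:

```
# Catch block script: email an alert with the error details.
# smtp_to(), error_number(), and error_message() are built-in
# Data Services functions; the address and text are illustrative.
smtp_to('dba-alerts@example.com',
        'BODS load job failed (error ' || error_number() || ')',
        'The input load failed with: ' || error_message(),
        0, 50);
```

(smtp_to() also assumes the SMTP settings have been configured for the Job Server.)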


voldal (BOB member since 2013-05-13)

Most likely because the data conversion is a warning, not an exception. Try/Catch only works with exceptions.


eganjp :us: (BOB member since 2007-09-12)

That doesn’t seem to be the case, because the BODS job doesn’t get past the attempt to load the .csv file. That data conversion error is the only thing in the BODS error log. The BODS trace log shows that the dataflow starts, but doesn’t show the dataflow ending. In the Data Services Management Console the job status is the big red X. That means the job failed, right? I see the End time in the console is blank, even though this job failed many hours ago.


voldal (BOB member since 2013-05-13)

Are you using Bulk Loader?


eganjp :us: (BOB member since 2007-09-12)

No. My data flow just uses a flat file format as source for a Query transform whose destination is a DB2 table.


voldal (BOB member since 2013-05-13)

In the Dataflow, open the Flat File source object. What settings do you have for “Maximum warnings to log” and “Maximum errors to stop job”?


eganjp :us: (BOB member since 2007-09-12)

They are both {no limit}. Other relevant-looking settings under Error Handling are: log data conversion warnings = Yes, log row format warnings = Yes, capture data conversion errors = Yes, capture row format errors = Yes.


voldal (BOB member since 2013-05-13)

Change “Maximum errors to stop job” to zero or one. I don’t remember which one will work.


eganjp :us: (BOB member since 2007-09-12)

Thanks!!! The value of 1 does what I want. On the first bad record my catch block executes.


voldal (BOB member since 2013-05-13)

Hi, can you please guide me on how to capture this using try/catch… specifically, what code should be written?


Parijatam (BOB member since 2016-09-08)

Also, why does the error-handling functionality for capturing data conversion errors not work, given that you are using a file as the source?


Parijatam (BOB member since 2016-09-08)

The try/catch block is actually capturing the exception; it’s just not one of the exceptions being handled. This is why a job that normally crashes on a file format error returns a 0 return code even though it produced an error.

This wreaks havoc on our job streams built in our 3rd party scheduler. The next job in the job stream gets fed a 0 return code from this job, and continues merrily along with the next set of jobs. :hb:
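
If the goal is a non-zero return code for the external scheduler, one possible workaround (my suggestion, not something confirmed in this thread) is to re-raise from the catch block after sending the notification, using the built-in raise_exception() function:

```
# Catch block script: alert, then fail the job explicitly so the
# scheduler sees a non-zero return code. raise_exception() is a
# built-in Data Services function; the address and message text
# are illustrative.
smtp_to('dba-alerts@example.com',
        'BODS job failed (error ' || error_number() || ')',
        error_message(), 0, 50);
raise_exception('Re-raising caught error: ' || error_message());
```

Because raise_exception() is called inside the catch block, the job still sends the email but then terminates with a failure status instead of completing "successfully".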


jlynn73 :us: (BOB member since 2009-10-27)

Thanks for your reply… can you please give a scenario or example to make it clearer? Or, if you have any PDF/document, please send it to me at d.parijatam@gmail.com


Parijatam (BOB member since 2016-09-08)