In Data Integrator how to schedule jobs depending on the successful completion of the previous job?
hemantkagale (BOB member since 2004-07-08)
I’m not aware of a way to do that in the scheduler, although it may exist.
We accomplish the same thing by creating workflows that encompass each of our jobs. We have a master job that executes these workflows, bracketed by Try/Catch blocks. The workflows execute in sequence until an error occurs; the remaining workflows are then skipped. The advantage of this is that DI allows you to restart from the point of failure once you have fixed the problem.
If you're not already doing so, it is good practice to use Try/Catch to trap errors throughout your ETL process and raise exceptions so that all dependent processing is halted. Our catch blocks send an email message when a failure occurs, so we instantly know the point of failure.
planetbob (BOB member since 2003-09-16)
Hi Bob
Thanks a lot for your suggestion. Our situation is such that we can't change or manipulate the jobs; we have to find a way to enforce the dependencies between the jobs from an external event-scheduler script.
We're actually thinking of writing a batch script which executes and controls the DI jobs according to our given dependency rules. The problem is that we don't know where to look up the job completion details (including those of the workflows/dataflows). Your inputs would be helpful.
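Since the jobs themselves can't be changed, one option is a control script that launches each job via its exported execution command (DI's "Export Execution Command" feature generates a launch script per job) and only proceeds when the previous launch returns a zero exit code. A minimal sketch of the chaining logic, with the real launch commands stubbed out as shell functions (`job_extract`, `job_transform`, and `job_load` are placeholder names, not actual DI artifacts; `job_transform` deliberately fails to show the skip behaviour):

```shell
#!/bin/sh
# Placeholders standing in for DI's exported job-launch commands.
job_extract()   { echo "extract ran";   return 0; }
job_transform() { echo "transform ran"; return 1; }  # simulated failure
job_load()      { echo "load ran";      return 0; }

# Run jobs in dependency order; stop the chain on the first failure.
run_chain() {
    for job in "$@"; do
        if ! "$job"; then
            FAILED_JOB=$job
            echo "$job failed; skipping remaining jobs" >&2
            return 1
        fi
    done
    return 0
}

# job_load is never invoked because job_transform fails first.
run_chain job_extract job_transform job_load \
    || echo "chain stopped at $FAILED_JOB"
```

For checking completion status after the fact, the DI repository database keeps a job-execution history (in my installation it is the AL_HISTORY table, with a status flag and end time per run); a control script could query that, but treat the exact table and column names as version-dependent and verify them against your own repository.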
Regards
Hemant
hemantkagale (BOB member since 2004-07-08)