Running batch jobs with 'dynamic' substitution parameters

We are currently running some batch jobs in 4.0 that need changes to the substitution parameters from run to run (some of the transforms only accept a substitution parameter, not a variable). I was thinking of automatically generating the batch file with the set of substitution parameters about to be run, but I am fairly sure that isn’t the best way to go about things. Are there any friendlier options at my disposal?


RyanRB (BOB member since 2014-07-01)

You’ve got 2 options here.

One is as you’ve described. Export the execution command and wrap a script around it to populate your variables.

The other is to use web services: call the real-time function run_batch_job from a real-time datastore, or use some other means to send the SOAP request (the suds module in Python makes this easy).

If you’re going to load it into an enterprise scheduler, I’m going to have to say that using the export execution command is going to be simpler. I have a set of jobs that have been running for ~5 years now and I can’t remember the last time I had a problem with one.
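For the web-services route, here is a rough sketch of the SOAP payload such a call might send. The namespace URI, operation name, and element names below are illustrative placeholders, not the real contract: check the WSDL your repository publishes before using any of them. With suds you would simply point `Client` at that WSDL and call the operation; this sketch only builds the envelope by hand to show the shape of the request.

```python
# Sketch of the SOAP request a run_batch_job call might send.
# The namespace, operation, and element names below are placeholders --
# check your repository's published WSDL for the real ones.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
DS_NS = "http://example.com/DataServices.xsd"  # placeholder namespace

def build_run_batch_job_request(job_name, repo_name, global_vars):
    """Build a SOAP envelope asking the job server to launch a batch job."""
    ET.register_namespace("soapenv", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{DS_NS}}}Run_Batch_Job")
    ET.SubElement(call, f"{{{DS_NS}}}jobName").text = job_name
    ET.SubElement(call, f"{{{DS_NS}}}repoName").text = repo_name
    variables = ET.SubElement(call, f"{{{DS_NS}}}globalVariables")
    for name, value in global_vars.items():
        var = ET.SubElement(variables, f"{{{DS_NS}}}variable", name=name)
        var.text = value
    return ET.tostring(envelope, encoding="unicode")

xml_payload = build_run_batch_job_request(
    "Job_Load_Addresses", "PROD_REPO", {"$G_RunDate": "2014-07-01"})
print(xml_payload)
```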


jlynn73 :us: (BOB member since 2009-10-27)

I’m still new to the use of substitution parameters so I could be wrong, but here goes…

The substitution parameter values are not in the export execution command file. They are in the repository. At runtime the job server reads in the source code for the job, and anywhere it finds a substitution parameter it substitutes in the value found in the repository. There is a term for this in C++ but it escapes me at the moment. Something about compile time, blah, blah, blah.

If you want to run an ETL job with different substitution parameter values for each call of the job then:

  1. Perhaps you are using substitution parameters wrong. Maybe you should be using global variables instead.
  2. Change the substitution parameter value in the repository just before you run the ETL job, and the job will pick up the current value.
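If the global-variable route is viable, per-run values can be appended to the exported execution command instead of touching the repository. The sketch below assumes a `-GV` launcher argument taking URL-encoded `name=value;` pairs; that flag and format are an assumption, so compare against a .bat file exported from your own Management Console before relying on them.

```python
# Sketch of a wrapper around an exported execution command that passes
# per-run values as global variables. The -GV flag and its URL-encoded
# "name=value;" format are assumptions -- verify against a real exported
# batch file before use.
import subprocess
from urllib.parse import quote

def build_gv_arg(global_vars):
    """URL-encode {"$G_RunDate": "2014-07-01"} into a -GV launcher argument."""
    pairs = "".join(f"{quote(n)}={quote(v)};" for n, v in global_vars.items())
    return f'-GV"{pairs}"'

def run_job(exported_bat, global_vars):
    """Append the global-variable argument to the exported command and run it."""
    cmd = f"{exported_bat} {build_gv_arg(global_vars)}"
    return subprocess.run(cmd, shell=True)

print(build_gv_arg({"$G_RunDate": "2014-07-01"}))
# -GV"%24G_RunDate=2014-07-01;"
```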

eganjp :us: (BOB member since 2007-09-12)

A Substitution Parameter can be associated with a System Configuration. Just treat the Sub Parm like a Datastore with many configurations in it.

In my language they are called Global Constants 8)

Sub Parms are a repository-level setting, not a job-level setting.

Also, just curious to know: which transform accepts only a Sub Parm?!


ganeshxp :us: (BOB member since 2008-07-17)

I was thinking of preprocessor directives: http://msdn.microsoft.com/en-us/library/3sxhs2ty.aspx


eganjp :us: (BOB member since 2007-09-12)

Wow, you really chased it down and found it! Kudos!


ganeshxp :us: (BOB member since 2008-07-17)

It took me a while. I don’t do C++. Heck, I haven’t done any of the hard-core programming languages in quite a few years.


eganjp :us: (BOB member since 2007-09-12)

Not quite the same thing, because once a program has been compiled, anything that came from preprocessor directives is hard-wired into the code and cannot be changed when the program is run. DS substitution parameters are associated with the system configuration used to run the job. So the OP has the following choices:

  1. Set up multiple system configurations, one for each set of substitution parameter values required. I think the system configuration can then be passed as part of the command line when the job is run (I don’t run jobs outside the Designer/Management Console interfaces)?
  2. Find some way to modify the substitution parameters prior to running the job - the only possibility would seem to be an ATL file imported immediately before running the job?

I would also be interested to know which transform only allows substitution parameters - I note that some of the DQ/cleansing transforms use them a lot.


dastocks (BOB member since 2006-12-11)

Thanks for the help.

As for this particular transform, it is an Address Cleanse transform, which is (I guess) not an out-of-the-box transform. Basically you feed it names and addresses and it standardizes them, flags movers, etc. Anyway, many of its parameters only accept a Substitution Parameter.


RyanRB (BOB member since 2014-07-01)

dynamic job scheduling is more fun. :twisted:

If you haven’t built your NCOA automation yet, beware that the USPS has just recently begun checking the logs you’re required to provide them. More specifically, they’re being really picky about the list dates. So if you populate your list return date with the current date and it rolls over midnight as it’s processing … that will generate a correction email from the post office when you submit your logs.


jlynn73 :us: (BOB member since 2009-10-27)


I would place the values in a DB table, and at the start of your batch job set the global variable values by reading from it.
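That table-driven approach can be sketched as below. In Data Services itself the read would typically be a script step using the `sql()` function against your parameter datastore; here sqlite3 merely stands in for that database, and the table/column names are made up for illustration.

```python
# Self-contained sketch of the table-driven approach: store per-run
# values in a parameter table, then read them into variables at job
# start. sqlite3 stands in for the real parameter database; the table
# and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE job_params (job_name TEXT, param_name TEXT, param_value TEXT)")
conn.execute(
    "INSERT INTO job_params VALUES "
    "('Job_Load_Addresses', '$G_ListDate', '2014-07-01')")

def load_params(conn, job_name):
    """Return {param_name: param_value} for one job, as a startup script would."""
    rows = conn.execute(
        "SELECT param_name, param_value FROM job_params WHERE job_name = ?",
        (job_name,))
    return dict(rows.fetchall())

global_vars = load_params(conn, "Job_Load_Addresses")
print(global_vars)  # {'$G_ListDate': '2014-07-01'}
```

A nice property of this design is that changing the next run’s values is just an UPDATE on the table, with no repository changes and no regenerated batch files.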