I’m looking into something a bit tricky, and I don’t know whether it is doable.
We have a production database and a copy database; the copy tends to be unavailable every now and then. Preferably we want to use the copy DB so we do not impact PROD, but if the copy is unavailable the job should get its data from PROD.
So what I would like to do (or to know if it is possible) is the following:
1 - Start my job “My_Job”
2 - Script object: test the Copy DB
-> If available -> Config = Copy DB
-> Else -> Config = PROD DB
I don’t mind if I need a second datastore or two configurations inside a datastore as long as there’s a way to do this.
Of course I thought of duplicating every DF, changing the source to Copy/PROD, and putting a condition at the beginning, but we have too many jobs and too many objects to modify.
Put the check inside a Try/Catch block:
TRY:
Script: set_env('COPY_AVBL','YES');
Dataflow - DF_CHECK_ENV: extract data from any small table in your Copy database into your staging DB.
CATCH: if the Copy database is down, the extraction in DF_CHECK_ENV will fail and the flow will go to the Catch part.
Use a script in the Catch block: set_env('COPY_AVBL','NO');
CONDITION:
Check the value of the environment variable.
If ( get_env('COPY_AVBL') = 'YES' )
place the actual extraction in dataflow DF_COPY, reading from the COPY database;
else
place the actual extraction in dataflow DF_PROD, reading from the PROD database.
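The Try/Catch-plus-condition flow above can be sketched outside Data Services too. Here is a minimal Python sketch of the same pattern, assuming hypothetical `connect` callables standing in for the small-table extract against each datastore (names like `probe` and `choose_config` are illustrative, not Data Services functions):

```python
# Sketch of the failover pattern: probe the copy DB, fall back to PROD.

def probe(connect):
    """TRY block: run a cheap check against the copy DB."""
    try:
        connect()          # stands in for DF_CHECK_ENV's small-table extract
        return "YES"       # script inside TRY: set_env('COPY_AVBL','YES')
    except Exception:
        return "NO"        # CATCH block: set_env('COPY_AVBL','NO')

def choose_config(copy_avbl):
    """CONDITION block: pick the datastore configuration by the flag."""
    return "Copy DB" if copy_avbl == "YES" else "PROD DB"

# Usage: copy DB reachable vs. unreachable (hypothetical stand-ins).
def copy_up():
    return True

def copy_down():
    raise ConnectionError("Copy DB unavailable")

print(choose_config(probe(copy_up)))    # Copy DB
print(choose_config(probe(copy_down)))  # PROD DB
```

The point of the pattern is that the probe is cheap and isolated: only the one small extract can fail, and the real extraction dataflows never run against an unreachable datastore.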
Please refer to the Reference Guide for the proper syntax of these functions.
Hope it helps!