Parameter not accessible in MAP operation

Dear all,

I am new to Data Services and am trying to build an example job that performs several operations.
Since the customer does not want to invest in any scheduler beyond the built-in one or the Windows scheduler, all parameters for the extraction have to be calculated more or less on the fly when the job starts.
I had the following design in mind:
JOB
 |
 +-- WF
      |
      +-- WF -- DF -- DF -- ...
      |
      +-- WF -- DF -- DF -- ...
      |
      +-- WF -- DF -- DF -- ...

The idea is to combine the dataflows for each source system,
so that each source system is extracted per subject.

As these jobs might be extended, additional data flows may be added over time. For each DF, the extract-from date has to be derived from the last execution, or set to a date far in the past to fetch all data if the DF has never been executed before.
But for this I need scripts and variables.
I built one script that writes this information to a table, one row per job/WF, so that the data from the last or current (restart) run can easily be read.
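For illustration, the last-run lookup in such a script could look roughly like this in the Data Services script language. All datastore, table and variable names here are placeholders I made up, not from a real project:

```
# Sketch only: read the extract-to date of the last successful run of
# this job/WF from a control table; fall back to an old date if the
# WF has never run before. ({$var} substitutes the variable's value,
# quoted, into the SQL text.)
$P_EXTRACT_FROM = sql('DS_CONTROL',
    'SELECT MAX(DATE_EXTRACT_TO) FROM LOAD_LOG
      WHERE JOB_NAME = {$G_JOB_NAME}
        AND WF_NAME  = {$G_WF_NAME}
        AND STATUS   = \'OK\'');

# First run: take everything since a fixed date in the past.
$P_EXTRACT_FROM = nvl($P_EXTRACT_FROM, to_date('1900.01.01', 'YYYY.MM.DD'));

$P_EXTRACT_TO = sysdate();
```

The script would then also insert a row into the control table with the new BATCH_LOAD_ID and the extract window, so a restart can pick up the current run instead of starting a new one.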

Now the issue: if I define parameters per DF and assign the values from the script to the parameters, I cannot access the parameter in the Map_Operation. There I would like to set, for INSERT rows, the column CREATE_BATCHLOAD_ID to the current BATCH_LOAD_ID,
and for UPDATE rows, the column UPDATE_BATCH_LOAD_ID to the current BATCH_LOAD_ID.

In the Query transform this works: for each workflow I can access DATE_EXTRACT_FROM/TO.

Sure, I could create a BATCH_LOAD_ID global variable. But I would need one for each job, since it is considered individual per job. And even if it were the same for all target tables, it would still feel strange and inconsistent.

In fact I wonder about this, because the SAP trainer said it is best practice to have the whole flow - staging extract, transformation, fact loading - within one job for an entire star schema.

I have worked in the DWH area for 20 years, with Informatica, DataStage, Oracle ODI and Oracle Warehouse Builder. All of these tools were, to some extent, logical to use.

In SAP Data Services I don't get the point: on the one hand people write a lot about reuse of components (which I don't believe in). Take the use case of a dimension shared across different stars: with an average schedule you just schedule the dimension load once. There is no need for code reuse beyond the data sources, the data definitions and maybe a few custom functions.

On the other hand, basic transformations are poorly built (like the History Preserving transform, which can't handle timestamps or intraday historization). A lot is also said about catching errors, but cleanup is a mess: a job does not run as one transaction, so if it fails you have to clean up your garbage yourself.


manfred71 :austria: (BOB member since 2017-01-23)

I'll now answer this myself.

It is possible to assign the dataflow parameter in the Map_Operation; you just have to type the parameter name in by hand, because SAP did not implement a selection for it in the dialog.
It fits well with the whole package ;).
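For anyone who finds this later, a sketch of what the workaround looks like. The parameter name is one I made up; the column names are the ones from my question. In the Map_Operation output schema, type the parameter directly into the column mapping for the relevant input row type instead of picking it from the dialog:

```
# Map_Operation output column mappings (typed by hand):
#   DF parameter $P_BATCH_LOAD_ID is filled from the script via the WF.

CREATE_BATCHLOAD_ID    input row type INSERT:  $P_BATCH_LOAD_ID
UPDATE_BATCH_LOAD_ID   input row type UPDATE:  $P_BATCH_LOAD_ID
```

The expression validates and runs even though the editor never offers the parameter for selection.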


manfred71 :austria: (BOB member since 2017-01-23)