I’m using IDocs as a source and loading data into a SQL database. When I try to load the data, I get one record multiple times in the target table, even though the source IDoc contains multiple distinct records. I’m only mapping columns, un-nesting, and using a Query transform. I don’t understand where the problem is.
Could you please point out where I’m going wrong?
Example: I have the schema MATERIAL with two subschemas, MATERIAL_TEXT (5 rows) and MATERIAL_COLORS (10 rows). If you unnest both at the same level, you actually join them: the result is every MATERIAL_TEXT row combined with every MATERIAL_COLORS row, which makes 5 * 10 = 50 rows.
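The effect described above can be sketched in plain Python (not Data Services syntax); the segment and field names below are illustrative assumptions, not the real IDoc structure:

```python
# Why un-nesting two independent subschemas at the same level
# multiplies the rows: with no join condition, flattening pairs
# every row of one child with every row of the other.

material = {
    "MATERIAL_TEXT":   [{"text_id": i} for i in range(5)],    # 5 child rows
    "MATERIAL_COLORS": [{"color_id": j} for j in range(10)],  # 10 child rows
}

# Flattening both children into one output is a Cartesian product.
flattened = [
    {**text, **color}
    for text in material["MATERIAL_TEXT"]
    for color in material["MATERIAL_COLORS"]
]

print(len(flattened))  # 5 * 10 = 50 rows, not 5 or 10
```

This is exactly the duplication seen in the target table: the rows are not wrong individually, there are just all combinations of them.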
Now, to join these three tables into one table I use a second dataflow, but our real-time job is not executing the second dataflow.
I tried these two approaches:
{ -Dataflow1->Dataflow2 }
{-Dataflow1 }->Dataflow2
I’m using the second dataflow to load data from the previous three tables into another database table.
So please guide me: how can I load this data into the database table?
Why a second dataflow? Real-time jobs do support having two dataflows, but not in the way you assume.
And second, we split the data into three queries because you can’t join the data, but now you are joining it again??? Can you please show an example with data: what you have as input and what the expected output is?
My requirement is this: I want to load multiple records through IDocs. As mentioned in my previous post, I have 3 schemas in the IDoc structure with approximately 316 columns in total: Schema1 has 100 columns, Schema2 has 100 columns, and Schema3 has 116 columns. I need to load these three schemas into a single table.
When I tried to load them directly, un-nesting at the same level, I was getting more records (a cross join). Following your guidance, I now load the schemas into separate tables. But my goal is still to build a single table from these three schemas.
So in my first step I load the three schemas into three target tables. Each schema has one primary key. Now I want to make one table from these three tables.
For example:
Schema1 -> Table1 --+
Schema2 -> Table2 --+--> Join and make one Master table
Schema3 -> Table3 --+
As per the above discussion, I make 3 tables, one per schema, and then need to load the data from these tables into a single one.
I hope the requirement is clear now. If you have any other approach, please share it with me.
My requirement is to load the data from multiple schemas into a single table using a real-time job.
When I tried to load all schemas at once, I was getting multiple (duplicated) records, so, as you suggested, I then loaded the data into a separate table per schema.
Then I realized I needed a primary key in each table so that I could join them and load the result into a single table. The SAP team added four new columns to each schema, which gave me a composite key (the combination of the four columns) in each schema.
Now I want to join these tables into a single table on that composite key.
I have defined the join condition below in the WHERE clause of the second dataflow:
(Tab1.col1 = Tab2.col1 and Tab1.col2 = Tab2.col2 and Tab1.col3 = Tab2.col3 and Tab1.col4 = Tab2.col4)
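A plain-Python sketch of this composite-key join may make the intent clearer; the table and column names (col1..col4, t1_data, etc.) are placeholder assumptions, not the real IDoc fields:

```python
# Join three per-schema tables on a shared composite key (col1..col4).

def key(row):
    """All four key columns together identify one record."""
    return (row["col1"], row["col2"], row["col3"], row["col4"])

table1 = [{"col1": 1, "col2": "A", "col3": 10, "col4": "x", "t1_data": "m1"}]
table2 = [{"col1": 1, "col2": "A", "col3": 10, "col4": "x", "t2_data": "m2"}]
table3 = [{"col1": 1, "col2": "A", "col3": 10, "col4": "x", "t3_data": "m3"}]

# Index tables 2 and 3 by the composite key, then join table 1 against them.
idx2 = {key(r): r for r in table2}
idx3 = {key(r): r for r in table3}

master = [
    {**r1, **idx2[key(r1)], **idx3[key(r1)]}
    for r1 in table1
    if key(r1) in idx2 and key(r1) in idx3  # inner join on all four columns
]

print(len(master))  # 1 master row per matching composite key
```

Because the join is on the full composite key, each input record contributes at most one master row, which avoids the cross-join duplication from before.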
The idea of a real-time flow is that you have a real-time message as input, the message flows through the dataflow, and at the end an output message is sent back. As you can see, you cannot have any hard stop in between; if you use multiple dataflows, the data still needs to flow through all of them. The only way to do that is to use the in-memory datastore. It’s a confusing name: it means a cache between real-time dataflows that connects them.
Hence I am inclined to say you should not use multiple dataflows; if you do, it is a workaround of some kind and should be the exception.