How to fetch multiple records from a single IDOC

Hi Gurus,

I’m using IDOCs as a source and loading data into a SQL database. When I load the data, I get one record repeated multiple times in the target table, even though the source IDOC contains multiple distinct records. I’m only mapping columns, un-nesting, and using a Query transform. I don’t understand where the problem is. :frowning:

Could you please point out where I’m going wrong?

I would appreciate your response!!!


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

Do you unnest two schemas at the same level?

Example: I have the schema MATERIAL with two subschemas, MATERIAL_TEXT (5 rows) and MATERIAL_COLORS (10 rows). If you unnest both, you effectively join them: the result is every MATERIAL_TEXT combined with every MATERIAL_COLOR, which makes 5 * 10 = 50 rows.
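A minimal Python sketch of that effect (not Data Services syntax; the record layout is made up to mirror the example): unnesting two sibling schemas in one query behaves like a Cartesian product of their row sets.

```python
from itertools import product

# Hypothetical MATERIAL record with two sibling nested schemas,
# mirroring the example above (names are illustrative, not real IDOC segments).
material = {
    "MATNR": "M-001",
    "MATERIAL_TEXT": [{"text": f"text-{i}"} for i in range(5)],       # 5 rows
    "MATERIAL_COLORS": [{"color": f"color-{i}"} for i in range(10)],  # 10 rows
}

# Unnesting both sibling schemas in one query behaves like a Cartesian
# product of the two row sets: every TEXT paired with every COLOR.
rows = [
    {"MATNR": material["MATNR"], **t, **c}
    for t, c in product(material["MATERIAL_TEXT"], material["MATERIAL_COLORS"])
]

print(len(rows))  # 50 = 5 * 10, the row explosion described above
```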


Werner Daehn :de: (BOB member since 2004-12-17)

Hi Werner Daehn,

Yes, we are indeed unnesting multiple schemas at the same time. Can you please let us know how we should unnest multiple schemas?

Thanks very much for your reply!!!


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

Think about my MATERIAL example: what do you want to accomplish? One COLOR with all TEXTs? The first COLOR with the first TEXT?


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, Werner, for your quick response!!

Can I complete this in a single dataflow, or do we need three different dataflows?

For more information, please find the attached screenshot of the IDOC structure…

The IDOC structure is like this:
Main_First_Table
Main_Table
------------Table1
------------Table2
------------Table3
IDOC_Structure.png


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

So you want to have three target tables, one per schema? That would be the IDOC connected to three queries, each using this technique:

https://wiki.sdn.sap.com/wiki/display/BOBJ/Separating+a+NRDM+node
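In rough Python terms (a sketch under assumptions, not DS syntax; DOCNUM and the field names are placeholders, not the real IDOC layout), separating a node means each query unnests exactly one nested schema and carries the parent key along:

```python
# Rough Python analogue of the "Separating a NRDM node" technique from the
# wiki link: each query unnests exactly ONE nested schema and tags its rows
# with the parent key so they stay relatable later.
def separate_node(idoc, node, parent_key="DOCNUM"):
    """Flatten a single nested schema, tagging each row with the parent key."""
    return [{parent_key: idoc[parent_key], **row} for row in idoc.get(node, [])]

idoc = {
    "DOCNUM": "0000000001",
    "Table1": [{"a": 1}, {"a": 2}],
    "Table2": [{"b": 10}],
    "Table3": [{"c": 100}, {"c": 200}, {"c": 300}],
}

# Three independent "queries", one per schema -- no cross join between them.
t1 = separate_node(idoc, "Table1")  # 2 rows
t2 = separate_node(idoc, "Table2")  # 1 row
t3 = separate_node(idoc, "Table3")  # 3 rows
```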


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, Werner!!!
Really good information.

Now, to join these three tables into one table, I am using a second dataflow. But our real-time job does not execute the second dataflow.
I tried these two approaches:

  1. { Dataflow1 -> Dataflow2 }
  2. { Dataflow1 } -> Dataflow2

I’m using the second dataflow to load the data from the previous three tables into another database table.

So please guide me: how can I load this data into the database table?


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

Why a second dataflow? Real-time jobs do support two dataflows, but not in the way you assume.

And second, we split the data into three queries because you can’t join the data, but now you are joining it again??? Can you please show an example with data: what you have as input and what the expected output is?


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, Werner, for your reply!!

My requirement is this: I want to load multiple records through IDOCs. As mentioned in my previous post, I have 3 schemas in the IDOC structure, with about 316 columns in total: Schema1 has 100 columns, Schema2 has 100 columns, and Schema3 has 116 columns. I need to load these three schemas into a single table.

When I tried to load them directly, unnesting at the same level, I got more records (a cross join). Following your guideline, I have now loaded the schemas into separate tables. But my goal is to build a single table from these three schemas.

In my first step I load the three schemas into three target tables. Each schema has one primary key. Now I want to make one table from these three tables.

For Example :-

Schema1 -> Table1 ---
Schema2 -> Table2 ---> join and make one -> Master table
Schema3 -> Table3 ---

As per the discussion above, I made 3 tables, one per schema, so now I need to load the data from these tables into a single one.

I hope this covers the whole requirement. If you have any other approach, please share it with me.

I would appreciate your response!! 8)


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

What is the primary key column of each of the three schemas? What is the PK of the target table? What are the join conditions?

Probably that’s what I am not getting…


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, Werner, for your reply!!!

My requirement is to load the data from multiple schemas into a single table using a real-time job.
When I tried to load all the schemas at once, I got multiple records, so I then loaded the data into a separate table per schema, as you advised.
Then I realized I needed a primary key in each table so that I could join the tables and load them into a single one. The SAP team added four new columns to each schema, and I found one composite key (a combination of four columns) in each schema.
Now I want to join these tables into a single table on this composite key.
I have defined the join condition below in the WHERE clause of the second dataflow:
(Tab1.col1 = Tab2.col1 and Tab1.col2 = Tab2.col2 and Tab1.col3 = Tab2.col3 and Tab1.col4 = Tab2.col4)

Now my ETL structure looks like this:

Dataflow1

IDOC -| Schema1 --> Query1 (unnest Schema1) --> Table1
      | Schema2 --> Query2 (unnest Schema2) --> Table2
      | Schema3 --> Query3 (unnest Schema3) --> Table3

End of Dataflow1

Dataflow2

Table1 -->
Table2 --> Join --> Master Table
Table3 -->

End of Dataflow2

But when I add the second dataflow, my job does not execute it.

If I’m wrong at any point or need to change my logic, please guide me!!!


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

And now, instead of query -> table and then table -> join_query, you do it all in one dataflow:

query1
query2 – join_query
query3
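For illustration, a minimal Python sketch of that shape (col1..col4 stand in for the composite-key columns from the earlier post; the sample rows are invented): three unnested row sets feed one join on the composite key, all in a single pass.

```python
# Sketch of the single-dataflow shape described above: three unnest queries
# feeding one join query on the four-column composite key mentioned earlier.
# col1..col4 and the sample rows are placeholders for illustration.
KEY = ("col1", "col2", "col3", "col4")

def key_of(row):
    return tuple(row[k] for k in KEY)

def join_on_key(left, right):
    """Inner join two flattened row sets on the composite key."""
    index = {key_of(r): r for r in right}  # assumes one row per key per set
    return [{**l, **index[key_of(l)]} for l in left if key_of(l) in index]

base = {"col1": "A", "col2": "B", "col3": "C", "col4": "D"}
query1_rows = [{**base, "s1_field": 1}]  # unnested Schema1
query2_rows = [{**base, "s2_field": 2}]  # unnested Schema2
query3_rows = [{**base, "s3_field": 3}]  # unnested Schema3

master = join_on_key(join_on_key(query1_rows, query2_rows), query3_rows)
print(master)  # one master row carrying fields from all three schemas
```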


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks Werner, I really appreciate your response!!! :wave:

I got the right result in a single dataflow!!!

I have one more question now:
Can we use a second dataflow in a real-time job? :?:
If yes, then how? :crazy_face:

I know you are a sea of knowledge, so kindly advise me!!!


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

The idea of a realtime flow is that you have a realtime message as input, the message flows through the dataflow, and at the end an output message is sent back. As you can see, you cannot have any hard stop in between; if you use multiple dataflows, the data still needs to flow through all of them. The only way to do that is by using the in-memory datastore. It’s a confusing name: it means a cache between realtime dataflows that connects them.

Hence I am inclined to say you cannot use multiple dataflows; if you do, it is a workaround of some kind and should be the exception.


Werner Daehn :de: (BOB member since 2004-12-17)