BusinessObjects Board

RFC Read Table vs RFC Streaming

Hi

I am using DS 4.1, and in a dataflow I am dragging an SAP table in directly. I would like to make sure that RFC streaming (in BODS 4.1) is enabled and that RFC Read Table is NOT invoked.

How can I check this?


learndi_2011 (BOB member since 2011-03-03)

RFC Read Table has existed since the very old days, whereas RFC Streaming is still an ABAP dataflow. Most BASIS teams (or, I suppose, the SAP default setting) enforce that RFC limit of 10 minutes, which is basically what you get with the RFC Read Table approach.

Talking about RFC Streaming, it is basically an ABAP program. A conventional ABAP dataflow, where the data transfer method is Shared Directory/FTP, is a 2-step process:

  1. The ABAP program runs and creates an output .DAT file (during which the DS process just sits idle, waiting for the ABAP to complete and give the green signal)
  2. The Data Services DF reads the .DAT file as its source and loads it into your target DB/file

Notice the hard dependency here. Take a source table in ECC with 10 million records: ABAP has to write all 10 million records to a .DAT file before DS can even start loading those 10 million records.

With RFC Streaming, on the other hand, an RFC stream connection is established: you open a pipe to the corresponding ABAP program, which writes directly into the pipe, and DS starts loading the target as and when chunks of data arrive in the pipe, thereby removing that hard dependency.
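A minimal Python sketch of the difference, with hypothetical stand-in functions (no SAP calls involved), just to show staging versus pipelining:

```python
def abap_extract(n_rows):
    """Stand-in for the generated ABAP program producing rows."""
    for i in range(n_rows):
        yield f"row-{i}"

def load(row):
    pass  # stand-in for the target DB/file writer

def staged_transfer(n_rows):
    """Shared Directory/FTP style: stage everything, then load."""
    dat_file = list(abap_extract(n_rows))  # DS idles until the .DAT file is complete
    for row in dat_file:                   # only now does the load begin
        load(row)

def streamed_transfer(n_rows, chunk_size=1000):
    """RFC Streaming style: load chunks as they arrive in the pipe."""
    buffer = []
    for row in abap_extract(n_rows):       # extraction and load overlap
        buffer.append(row)
        if len(buffer) == chunk_size:
            for r in buffer:
                load(r)
            buffer.clear()
    for r in buffer:                       # flush the final partial chunk
        load(r)

staged_transfer(10_000)
streamed_transfer(10_000)
```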

So RFC Streaming is still ABAP.
You just have to upload/re-transport the ABAP after setting the transfer method to RFC in the datastore. If you read through the ABAP code and compare it to the old ABAP, at the place where the old version writes to the file, the new ABAP will show the data being written into the RFC pipe you created.

I hope I explained it and did not confuse you.


ganeshxp :us: (BOB member since 2008-07-17)

Thanks a lot, Ganesh, for the clarifications. Just a couple of questions.

  1. "RFC limit of 10 minutes which is basically the RFC Read Table approach. "

While using RFC streaming transport method - I used to get this error after 10 mins, we used to ask basis team to increase time limit to 20 mins. It used to be successful.

So my question is: in this scenario was RFC streaming was being used or RFC read table. Our intention is to use RFC streaming. seems like opposite is happening. Please correct me if i am wrong.

  1. “You will have to just upload/re-transport the ABAP after setting RFC in Datastore”

If i am using the table in a regular DF, we can’t generate ABAP code. My intention is to avoid ABAP or R/3 DF.

  1. I am extracting data from large table BSEG,BSAK - cluster tables. RFC streaming is very slow and fails even after making the SAP setting for 30 mins.

How to resolve this. is ABAP programme the only option ?

  1. If i use Custom extractor, can large table be extracted w/o any limitations. what actually happens in the background for Extractors ?

learndi_2011 (BOB member since 2011-03-03)

Simple one-liner…

If your SAP table is in a regular dataflow, then this is RFC Read Table and not RFC Streaming…


ganeshxp :us: (BOB member since 2008-07-17)

Hi Ganesh,

But the post below suggests otherwise. Please check the comment as well.

http://wiki.scn.sap.com/wiki/display/EIM/Reading+via+RFC+Read+Table

I believe the comment was added after a discussion in that post.


learndi_2011 (BOB member since 2011-03-03)

Hmm, I do see that.

So the program that an SAP table in a regular DF uses is the streaming one. I will try to watch the ECC system and confirm this with my BASIS team.

If we delete the RFC destination SAPDS, will the table in the regular dataflow break too? Would that be a test case to prove that a table in a regular DF uses RFC Streaming?


ganeshxp :us: (BOB member since 2008-07-17)

Yes, we can delete that and see.

  1. In a regular DF, is there a way to check whether RFC_READ_TABLE or RFC_STREAM_READ_TABLE is being called? The SM37 t-code? (See the probe sketch below.)

  2. Can you please check my other questions as well?
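For what it's worth, outside of DS you can probe which of these function modules even exists on the system with a small PyRFC script. This is a hedged sketch: the logon details are placeholders, and the exact streaming FM name (it may live in the /BODS/ namespace, depending on the DS functions transport installed) is an assumption.

```python
from pyrfc import Connection, RFCError

# Placeholder logon details -- substitute your ECC host, client, and user.
conn = Connection(ashost='ecc-host', sysnr='00', client='100',
                  user='DS_USER', passwd='***')

# FM names as discussed in this thread; '/BODS/RFC_STREAM_READ_TABLE' is an
# assumed namespaced variant and may differ on your system.
for fm in ('RFC_READ_TABLE',
           'RFC_STREAM_READ_TABLE',
           '/BODS/RFC_STREAM_READ_TABLE'):
    try:
        conn.get_function_description(fm)  # raises if the FM is unknown
        print(fm, 'exists on this system')
    except RFCError:
        print(fm, 'not found')
```

As for what is actually called at runtime: SM37 only shows background jobs; a dialog-mode RFC call would more likely show up in SM50 or an ST05 trace while the DS job runs.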

Let me know your findings. I appreciate all your help!!

This question has kept on bugging me for the past 6 months :hb:


learndi_2011 (BOB member since 2011-03-03)

Your question is answered here by Werner:

In other words: RFC Read Table is replaced by RFC streaming. Yay for the development team and the guys who thought up the specifications!


Johannes Vink :netherlands: (BOB member since 2012-03-20)

Thanks a lot for the confirmation.

What advantage is there to adding the table in an ABAP DF instead of a regular DF?
Also, can you please have a look at the other queries in this post?


learndi_2011 (BOB member since 2011-03-03)

Ah, I did not see your list of questions :wink:

Question 1 is answered with that: RFC streaming is also used when reading an R/3 table directly in a DF.

  2. "You just have to upload/re-transport the ABAP after setting the transfer method to RFC in the datastore"

Only applicable when you use an R/3 DF.

  3. I am extracting data from the large tables BSEG and BSAK (cluster tables). RFC streaming is very slow and fails even after raising the SAP timeout setting to 30 minutes.

As you said, big tables. BSEG is often the biggest of them all (maybe MSEG surpasses it?). You cannot extract BSEG in one go, unless you want to wait hours or days for it. The only viable alternative is to join with BKPF, select on creation date or document date, and so split your selection (see the sketch below).

However, with an SAP table directly in a DF, you cannot join multiple SAP tables inside SAP! The only option is to use R/3 DFs…
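To illustrate the splitting idea outside of DS (in DS itself you would do this with an R/3 DF joining BKPF): pull the data one posting-date window at a time. A hedged PyRFC sketch, with placeholder logon details and illustrative field choices, reading BKPF headers per monthly window via the classic RFC_READ_TABLE:

```python
from datetime import date, timedelta
from pyrfc import Connection

# Placeholder logon details -- substitute your ECC host, client, and user.
conn = Connection(ashost='ecc-host', sysnr='00', client='100',
                  user='DS_USER', passwd='***')

def month_windows(year):
    """Yield (first_day, last_day) posting-date windows, one per month."""
    for m in range(1, 13):
        first = date(year, m, 1)
        nxt = date(year + (m == 12), m % 12 + 1, 1)
        yield first, nxt - timedelta(days=1)

for low, high in month_windows(2013):
    # RFC_READ_TABLE returns each row in a single 512-byte work area, so
    # FIELDS must be restricted when the table is wide (BKPF/BSEG are).
    result = conn.call(
        'RFC_READ_TABLE',
        QUERY_TABLE='BKPF',
        DELIMITER='|',
        FIELDS=[{'FIELDNAME': f} for f in ('BUKRS', 'BELNR', 'GJAHR')],
        OPTIONS=[{'TEXT': f"BUDAT BETWEEN '{low:%Y%m%d}' AND '{high:%Y%m%d}'"}],
    )
    print(f"{low}..{high}: {len(result['DATA'])} document headers")
```

Each window's document keys can then drive the BSEG item fetch, keeping every single call well under the RFC timeout.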

  4. If I use a custom extractor, can a large table be extracted without any limitations? What actually happens in the background with extractors?

Extractors: no clue.


Johannes Vink :netherlands: (BOB member since 2012-03-20)

Thanks once again.

Just a few final questions to get my understanding of SAP -> BODS connectivity right.

  1. What is the real advantage of an ABAP DF (RFC transport method) over RFC streaming in a regular DF?

  2. I would like to avoid the RFC timeout error with RFC streaming in a regular DF. Even if I set it to run in the background, it still fails after exactly 10 minutes. I suspect it is not actually running in the background, or is it that RFC calls can't be run in the background?

  3. I need to know whether we can avoid timeout errors with extractors.


learndi_2011 (BOB member since 2011-03-03)

The ability to push complex operations back onto the SAP ERP/CRM system. In ABAP DFs you can join tables, call functions, etc., all of which becomes an ABAP program generated by SAP Data Services. That program is uploaded to your open development client on ERP/CRM, and from there you will need to transport it to UAT and production, with all the appropriate testing steps in between.

No, this is not normal. I have used RFC Streaming with the background option enabled to extract really large SAP tables out of ERP (ECC 6.0). Without that option, it would indeed just abort after 10 minutes.

In ERP/CRM, ensure that your RFC destination is set up properly, that your Data Services user has sufficient privileges to run background jobs, and that a sufficient number of background jobs is allowed by the BASIS team.

Avoid the BW extractors if you can? :slight_smile: Some of the extractors I've had to work with on SAP CRM were so incredibly slow and sluggish that they just kept dying on us no matter what. Looking into the SAP BW logs, we could see that the problem wasn't just Data Services: SAP BW was also struggling to get some of the data out of SAP CRM. Some of the extractors can be real pigs; using ABAP DFs may be a better option here.


ErikR :new_zealand: (BOB member since 2007-01-10)

Thanks a lot for the clarifications. Certainly helps :slight_smile:


learndi_2011 (BOB member since 2011-03-03)

Okay, I confirmed that a direct SAP table (RFC Read Table) and RFC Streaming in an ABAP dataflow use the same RFC streaming process.

However, with the direct SAP table, 2 things should be noted:

  1. The job runs in dialog mode rather than communication/background mode, so it hits the dialog timeout (usually 10 minutes)
  2. Joins of SAP tables are done after bringing the data into Data Services, hence not so efficient on big tables

My take on extractors is simply this:
If you know the logic of what the extractor does, then you can point at the underlying tables and implement the logic better yourself, even for simple ones. We have used extractors, and whenever we touch such a job for enhancements, our policy is to change to the physical tables if possible. In some cases I couldn't even understand why an extractor would miss a couple of columns in its default structure. Anyhow, I would use them only for complex things like the inventory process, FI_GL, etc.


ganeshxp :us: (BOB member since 2008-07-17)

Which type of user should we create in ECC/BW: communication or system? Can we execute jobs in the background with both types of users?


learndi_2011 (BOB member since 2011-03-03)

Communication. I suggest you read the SAP part of the manual :wink: The authorization setup is also listed there.


Johannes Vink :netherlands: (BOB member since 2012-03-20)