Now the big question is: how does this perform in the real world? What are the advantages and disadvantages? What are the next limits? Are there limits on the width and depth of SAP tables? On the amount of data?
We are planning to start testing BODS 4.1 in a few months' time. Until then:
Does anyone have experience to share? If so, could you describe the size of the SAP box, the amount of data transferred, the typical tables accessed, and whether you also used the delta mechanism (CDHDR etc.)?
The absence of limits on depth and width is interesting, as RFC_READ_TABLE did have some problems with that.
Is this because normally an RFC call runs in the foreground, whereas with an RFC destination set up in transaction SM59 it runs in the background?
What happens then? Because the wiki says:
Does a CR still break rows? Or will the text be transferred correctly, including the CR? In the BODS engine we can filter on certain hex values and thus filter out unwanted CRs in text fields.
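For illustration only (this is not the actual BODS filter syntax, and the field names are made up): filtering unwanted control characters such as CR (hex 0D) out of text fields can be sketched like this in Python:

```python
# Hypothetical sketch: strip unwanted control characters (e.g. CR, hex 0D)
# from text fields, mimicking what a hex-value filter in the engine would do.
# The set of hex values and the row layout are example assumptions.

UNWANTED = {0x0D, 0x0A, 0x00}  # CR, LF, NUL


def clean_field(text: str) -> str:
    """Remove characters whose code point is in the unwanted set."""
    return "".join(ch for ch in text if ord(ch) not in UNWANTED)


def clean_row(row: dict) -> dict:
    """Apply the filter to every string field of a row; leave other types alone."""
    return {k: clean_field(v) if isinstance(v, str) else v
            for k, v in row.items()}


row = {"MATNR": "100042", "MAKTX": "Pump\r\nhousing"}
print(clean_row(row))  # {'MATNR': '100042', 'MAKTX': 'Pumphousing'}
```

The point of doing this on the engine side is that a CR which survived extraction can still be removed before it reaches the target.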
Werner, something else. If with RFC streaming we still have problems with unwanted characters, is there a possibility to include in the BODS RFC functions a function that cleans up unwanted characters on the SAP side? Every BODS-SAP project has problems with this…
I can imagine that, and there is only one way to find out.
A colleague of mine informs me that RFC streaming is only available when the ABAP execution option is set to ‘Generate and Execute’. Can this be true?
I’m not aware of any sensible project that allows ‘Generate and Execute’ in a production environment so this would be a major limitation.
RFC is added as one of the transfer methods, so I don't think it is available only in Generate and Execute; it should work for both Generate and Execute and Execute Preloaded.
I have removed the line “does work with unexpected chars” from the Wiki. I talked to the developer and we are using a delimited row model internally, not a fixed-width as I thought initially.
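To illustrate why the row model matters for unexpected characters (a hypothetical sketch with a made-up delimiter and made-up column widths, not the internal BODS format): in a delimited row model, a stray delimiter-like character inside a field shifts every following column, whereas fixed-width slicing is unaffected.

```python
# Hypothetical sketch: an unexpected character corrupts a delimited row
# model but not a fixed-width one. Delimiter and widths are assumptions.

DELIM = "\t"


def parse_delimited(line: str):
    """Split on the delimiter; an embedded delimiter yields extra columns."""
    return line.split(DELIM)


def parse_fixed_width(line: str, widths):
    """Slice by position; embedded 'special' chars stay inside their field."""
    cols, pos = [], 0
    for w in widths:
        cols.append(line[pos:pos + w].rstrip())
        pos += w
    return cols


good = "100042\tpump"
bad = "1000\t42\tpump"                 # a tab leaked into the first field
print(len(parse_delimited(good)))      # 2 columns, as expected
print(len(parse_delimited(bad)))       # 3 columns -- the row is broken

fixed = "100042    pump      "
print(parse_fixed_width(fixed, [10, 10]))  # ['100042', 'pump']
```

This is why the removed wiki line matters: with a delimited model internally, characters that collide with the delimiter (or with the row terminator, like CR) can still break rows.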
Yes, RFC_Streaming supports execute_in_background yes and no, as well as generate_and_execute and execute_preloaded. That was in fact the difficult part.
Just keep in mind how that works:
The ABAP collects a batch of rows in an ITAB memory table; previously it then wrote these 1000 rows into a file. Now, when RFC streaming is used, the ABAP instead calls a BAPI function on a remote SAP system. Only that this remote SAP system happens to be the al_engine process.
Can an ABAP running in background call remote SAP systems? Of course it can.
Can an ABAP that was deployed in production call a remote SAP system? Of course it can.
There is just one thing to watch out for: when running in background you must have an SAP RFC destination created. In dialog mode (execute_in_background=no) there is an active RFC connection from the al_engine to the SAP server, which the ABAP can use to call back the al_engine. With execute_in_background=yes there is no active connection, hence the ABAP has to create one, using the RFC destination as the hook.
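The flow described above can be sketched roughly as follows, in Python rather than ABAP. The batch size of 1000 rows comes from the description above; everything else (function and parameter names) is invented for illustration:

```python
# Hypothetical sketch of the streaming path described above:
# the ABAP fills an internal table (the 'ITAB') and, per batch of
# 1000 rows, calls back into the al_engine through the RFC destination
# (instead of writing the batch to a file, as in the old path).

BATCH_SIZE = 1000


def stream_table(rows, send_batch):
    """Buffer rows in an in-memory table and flush every BATCH_SIZE rows."""
    itab = []
    for row in rows:
        itab.append(row)
        if len(itab) == BATCH_SIZE:
            send_batch(itab)   # the RFC call back to the al_engine
            itab = []
    if itab:                   # flush the final, partial batch
        send_batch(itab)


received = []
stream_table(range(2500), lambda batch: received.append(len(batch)))
print(received)  # [1000, 1000, 500]
```

In background mode the callback connection does not exist yet when the ABAP starts, which is exactly why the SM59 destination is needed: it tells the ABAP where to reach the al_engine.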
Werner:
Too bad about the special characters. I would need to create an account specifically to vote on the idea, but basically this is a problem that has been around for years. We can live with it, but for financial interfaces, where every bit of data needs to be extracted, it is a real pain in the ***.
About the RFC streaming: in other words, only an RFC destination in transaction SM59 is required when you want to run in the background? So for a quick system setup, only the BODS transports and an SAP user with the correct rights are required?
I am a bit confused about the status of using SAP tables directly in normal dataflows. Notwithstanding the comments about strange characters, are there limits on row/column widths and data volume, as there are in DS 4.0?
Comments from some posters here suggest there still are, but the following wiki page implies to me there aren't:
What I know for 100% is that width and number of rows do not matter with RFC streaming, for ABAP dataflows and regular dataflows alike.
My developer said, or at least I believe I understood, that the extra chars are no problem for the table used directly in the dataflow, but for ABAP dataflows they still are. Why??
Anyway, I need to look into the source code to get to the bottom of this.
If anybody sees anything that contradicts my statement, please let me know. We would then make sure RFC streaming is actually being used, and then we could start investigating.
If I interpret the table correctly, then with an SAP table directly in a DF the function RFC_READ_TABLE is used (the hyperlink goes to that function's page).
For larger amounts of data it is recommended to use the new RFC streaming, which implicitly means that an R/3 DF is required, as RFC streaming is only a new transfer method for R/3 DFs.
Nope, RFC streaming is used both for ABAP dataflows and for tables directly in the dataflow.
In the latter case no ABAP code is generated, and none needs to be uploaded.
Someone will need to update that wiki page then, as it states what Johannes believed. I added a comment to the page, but I don't know if many people read comments on the wiki.
We are using DS 4.1 SP1 environment.
Created a new Datastore for SAP.
Data Transfer Method: RFC
RFC Destination: SAPDS (Created a new TCP/IP connection and enabled it via DS Management console)
While extracting data from a table, there is a source property "Execute in Background". If we check this option, we get the error mentioned below:
Error calling RFC function to get table data: <RFC_ABAP_MESSAGE, SY-MSGTY: E, SY-MSGID: S#, SY-MSGNO: 047> - Processing Terminated
If we do not enable this option, then we are able to get the data from the source table.
Is this error due to an authorization issue (i.e. should the user ID be given SAP_ALL permission), given that we have installed the transport provided by SAP for 4.1 (revised authorization)?
Just want to know: what is the difference between the datastore option "Execute in Background" and the source table property "Execute in Background", and how do we resolve this error?
Also, will this RFC connection "SAPDS" start extracting data from the source, instead of just enabling the connection?
I got my job working without setting this. Then I set the check-box.
I tried this and initially I got an error (not the same as yours though).
1756 5624 R3C-150605 22/05/2013 12:45:33 PM |Data flow DF_SAP_PS_TR_CONTROLLING_OBJECT_DOC_HEADER|Reader COBK1|Transform COBK1__Driver
1756 5624 R3C-150605 22/05/2013 12:45:33 PM The SAP job was canceled for host < >, job name , job count <12451900>, job log from SAP
1756 5624 R3C-150605 22/05/2013 12:45:33 PM <20130522, 124523, Job started
1756 5624 R3C-150605 22/05/2013 12:45:33 PM 20130522, 124523, Step 001 started (program /BODS/RPT_STREAM_READ_TABLE, variant &0000000000000, user ID DSERVICES)
1756 5624 R3C-150605 22/05/2013 12:45:33 PM 20130522, 124523, The destination SAPDS has not been maintained! (see long text)
1756 5624 R3C-150605 22/05/2013 12:45:33 PM 20130522, 124523, Job cancelled after system exception ERROR_MESSAGE
1756 5624 R3C-150605 22/05/2013 12:45:33 PM >.
The RFC connection has been set up in the SAP system in SM59. Do we also need to establish the RFC connection in the DS Management Console once the setup in SAP is done?
What access level does the user ID that you are connecting with via BODS have?