We are extracting data from our SAP system using RFC functions.
The SAP application datastore holds our imported functions (set to parallel execution).
The data is then loaded into a Microsoft SQL Server 2012 database.
Day after day, without any changes on our side, the dataflow that calls the RFC function takes more and more time.
We are now at 3 hours for a mere 66,000 records.
How can we solve this? Should we ask an ABAP programmer to look at the function in SAP itself?
What should we pay special attention to when debugging the remote function?
I think it is custom; it is called Z_HIER_DWH.
The input parameters are:
MANDANT
SETCLASS
SETNAME
I am not aware of the amount of data in the source tables, but since they created a custom RFC for this, I suspect the tables are big.
I am also not aware of which tables are accessed.
So it is difficult to decide whether to use the standard hierarchy extractor for it.
About background jobs: do you mean the option at datastore level ("Execute in background (batch)")? That parameter is set to Yes. Or do you mean another parameter?
To figure out which tables are accessed, you need to look at the ABAP code of the function, via transaction SE37 or SE80.
If the only tables accessed are SETHEADER and SETLEAF, then those are standard SAP tables. The input parameters of the custom function seem to suggest that.
If so, the standard hierarchy extractor in BODS can be used. It is quick, even with 60k records (we are talking seconds).
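If you want to peek at the set contents without BODS first, the standard RFC_READ_TABLE function can be called over RFC. Below is a minimal sketch that only builds the call parameters; the chosen FIELDS list and the pyrfc connection details in the usage comment are assumptions, while SETCLASS, SETNAME, VALFROM and VALTO are real SETLEAF columns:

```python
# Minimal sketch: build the parameters for SAP's standard RFC_READ_TABLE
# call to inspect SETLEAF for one set. The selected FIELDS are an
# assumption about what is useful to look at, not a fixed requirement.
def build_setleaf_query(setclass, setname):
    return {
        "QUERY_TABLE": "SETLEAF",
        "DELIMITER": "|",
        # Each OPTIONS line is limited to 72 characters
        "OPTIONS": [{"TEXT": f"SETCLASS = '{setclass}' AND SETNAME = '{setname}'"}],
        "FIELDS": [{"FIELDNAME": f} for f in ("SETNAME", "VALFROM", "VALTO")],
    }

# Usage with pyrfc (requires the SAP NW RFC SDK; connection values below
# are placeholders, not real system details):
# from pyrfc import Connection
# conn = Connection(ashost="...", sysnr="00", client="100",
#                   user="...", passwd="...")
# result = conn.call("RFC_READ_TABLE", **build_setleaf_query("0000", "MYSET"))
# print(len(result["DATA"]))  # number of leaf rows in the set
```

A quick row count like this also tells you whether 66,000 records is plausible for the set, or whether the RFC is returning more than it should.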
The background job I meant is an R/3 dataflow set to execute in the background. The setting in the datastore influences that, but it has no impact on the custom RFC.
I thought so
To examine the ABAP code, I would need access to the system, I am afraid.
I will keep you posted; tomorrow I will sit down with the project manager to see what we can change about the function.
Thanks in advance.
Edit: apparently something went wrong when I imported the template tables. For one table I forgot to check "Delete data from table before loading", so each run was sending more and more data through the RFC function.
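For anyone hitting the same growing-runtime symptom, here is a minimal sketch of the effect described in the edit. It uses sqlite3 as a stand-in for the SQL Server target, and the table and column names are hypothetical; the point is only to show how skipping the delete-before-load makes each run append on top of the previous one:

```python
import sqlite3

def load(conn, rows, delete_before_load):
    """Load rows into a staging table, optionally clearing it first
    (what the BODS 'Delete data from table before loading' option does)."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS hier_stage (setname TEXT, value TEXT)")
    if delete_before_load:
        cur.execute("DELETE FROM hier_stage")
    cur.executemany("INSERT INTO hier_stage VALUES (?, ?)", rows)
    conn.commit()
    cur.execute("SELECT COUNT(*) FROM hier_stage")
    return cur.fetchone()[0]

rows = [("COSTCENTERS", str(i)) for i in range(3)]

# Without the delete, the table grows with every run, and anything
# downstream that reads it processes more data each time.
conn = sqlite3.connect(":memory:")
print([load(conn, rows, delete_before_load=False) for _ in range(3)])  # [3, 6, 9]

# With the delete, every run starts from a clean table.
conn2 = sqlite3.connect(":memory:")
print([load(conn2, rows, delete_before_load=True) for _ in range(3)])  # [3, 3, 3]
```

In BODS the fix is simply ticking the option on the target table; the sketch just makes the accumulation visible.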