BusinessObjects Board

Need to extract a few SAP tables with data from 35 million to 200 million rows

Hi, we are extracting data from SAP using ABAP data
flows and loading it into SQL Server in the cloud.
Most of the tables have loaded fine, but on anything more than 30 million rows we are facing issues like unknown exceptions and failing jobs.
What kind of optimization is recommended in this case?
We have dropped the PKs and tried running the job in the background by ticking the option in the datastore… anything else we should try?

Any help appreciated.


zuluking (BOB member since 2012-05-31)

What is your data transfer method? FTP or RFC?


eganjp :us: (BOB member since 2007-09-12)

I’ve had no problems extracting 20-60 (and more) million records from ECC 6.0 on HANA using background-enabled RFC extracts.

SAP always seems to recommend using ABAP data flows with RFC streaming, which certainly does work, but we've had no issues extracting large volumes through the simple RFC interface either, nor did we notice any real performance difference. That really isn't a huge surprise, as both methods use the same output mechanism: streaming records out in buckets of 5000.
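
For what it's worth, the bucketed streaming pattern is easy to picture outside of Data Services. Below is a minimal Python sketch using SAP's pyrfc connector and the standard RFC_READ_TABLE function module; this is not what Data Services does internally, RFC_READ_TABLE caps the row width at 512 bytes, and the host, credentials, and table name are placeholders.

```python
# A rough sketch (not the Data Services internals) of pulling an SAP table
# in fixed-size buckets over RFC. Uses SAP's pyrfc connector and the
# standard RFC_READ_TABLE function module; connection details and the
# table name are placeholders.
from pyrfc import Connection

conn = Connection(ashost="sap.example.com", sysnr="00",
                  client="100", user="EXTRACT_USER", passwd="secret")

BUCKET = 5000  # mirrors the 5000-record buckets described above
skipped = 0
while True:
    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE="MARA",   # example table; substitute your own
        DELIMITER="|",
        ROWCOUNT=BUCKET,      # fetch one bucket per call
        ROWSKIPS=skipped,     # offset into the table
    )
    rows = result["DATA"]
    if not rows:
        break                 # no more data
    for row in rows:
        fields = row["WA"].split("|")
        # ... hand each record to the downstream loader here ...
    skipped += BUCKET

conn.close()
```

Chunking like this also makes it obvious where a long-running extract dies, which is exactly the visibility you lose when a single monolithic job throws an "unknown exception" somewhere past 30 million rows.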

Are you extracting from an on-premises SAP ERP environment into a cloud-based SQL Server instance? Or is your SAP ERP environment also in the cloud?

If your ERP is on premises, I would stage the data locally first - preferably in a database, otherwise in text files if you have to - and then load it into the cloud-based SQL Server instance. At least to investigate where you are running into problems.
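
To make the staging idea concrete, here is a minimal Python sketch of the second half of that pattern: reading a locally staged pipe-delimited file and pushing it to the cloud SQL Server instance in committed batches. The connection string, the hypothetical three-column staging table stg_mara, and the file name are all assumptions, not anything from your environment.

```python
# A minimal sketch of the "stage locally, then load" pattern: read a
# locally staged pipe-delimited file and push it to the cloud SQL Server
# instance in committed batches. The connection string, the hypothetical
# three-column staging table stg_mara, and the file name are assumptions.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server.database.windows.net;"
    "DATABASE=staging;UID=loader;PWD=secret"
)
cur = conn.cursor()
cur.fast_executemany = True   # send inserts in bulk rather than row by row

BATCH = 50_000
batch = []
with open("mara_extract.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f, delimiter="|"):
        batch.append(row)
        if len(batch) == BATCH:
            cur.executemany("INSERT INTO stg_mara VALUES (?, ?, ?)", batch)
            conn.commit()     # commit per batch so a failure is resumable
            batch = []
    if batch:
        cur.executemany("INSERT INTO stg_mara VALUES (?, ?, ?)", batch)
        conn.commit()

conn.close()
```

Committing per batch means a failed load can be diagnosed and resumed from a known point instead of re-running the whole multi-hour transfer.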

Also ensure that your RFC connections are background enabled, as foreground jobs will time out with such volumes of data.


ErikR :new_zealand: (BOB member since 2007-01-10)