Extractors Calling Error?

Hi Gurus,

I’m fetching data from an SAP system using an Extractor, but when I execute the job I get this error:

"Error calling RFC function to get reader data: <RFC_ABAP_RUNTIME_FAILURE-(Exception_Key: TSV_TNEW_PAGE_ALLOC_FAILED)- No more storage space available for extending an internal table"

If anybody has a solution, please let me know.
I would appreciate a quick response :!:


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

You should upgrade to PI_BASIS patch 9 or at least implement SAP Note 1533946 (if it is even available to the public yet).

The background of the problem is that the packet size specified in the Extractor reader (default 10,000 rows) can be too large for wide structures, which is what caused your TSV_TNEW_PAGE_ALLOC_FAILED error. So instead of you fiddling with the proper value, the BAPI itself now breaks the packages into chunks of at most 200 MB where needed.
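The chunking idea behind that fix can be pictured with a minimal sketch. This is illustrative Python, not the actual BAPI code; `split_package`, the row width, and the sample sizes are all assumptions for demonstration, only the 200 MB cap comes from the description above:

```python
# Sketch of the chunking idea: instead of returning one huge package,
# split it so each chunk stays under a byte budget. All names here are
# illustrative, not the actual BAPI implementation.

MAX_CHUNK_BYTES = 200 * 1024 * 1024  # the 200 MB cap mentioned above

def split_package(rows, row_width_bytes):
    """Yield lists of rows whose total size stays under MAX_CHUNK_BYTES."""
    rows_per_chunk = max(1, MAX_CHUNK_BYTES // row_width_bytes)
    for i in range(0, len(rows), rows_per_chunk):
        yield rows[i:i + rows_per_chunk]

# Example: a very wide structure (1 MB per row, hypothetical) forces small
# chunks even though the reader asked for 10,000 rows at once.
rows = list(range(500))                          # stand-in for 500 rows
chunks = list(split_package(rows, 1024 * 1024))  # 200 rows fit per chunk
print(len(chunks), len(chunks[0]))               # 3 chunks of up to 200 rows
```

The point is that the chunk count adapts to the row width, so a wide structure no longer blows out the internal table on the SAP side.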


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, Werner, for your quick answer!! :wave:

But I don’t know how to upgrade to PI_BASIS patch 9 or how to implement SAP Note 1533946.

Could you please explain?


rajeev_khokhar1 :india: (BOB member since 2011-01-19)

You can’t; you have to go to your SAP team with that information. For now, reduce the packet size in the Extractor reader object of your dataflow until it works.
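The "reduce it until it works" advice amounts to a simple trial loop. A hedged sketch, where `extract()` and the `MemoryError` it raises are stand-ins for the real RFC call and its TSV_TNEW_PAGE_ALLOC_FAILED dump, and the memory ceiling is made up:

```python
# Sketch of "reduce the packet size until it works": halve the packet
# size whenever the SAP side runs out of internal-table memory.
# extract() and its failure mode are hypothetical stand-ins.

def extract(packet_size):
    """Pretend RFC call: fails when the packets are too large."""
    if packet_size > 2500:                # hypothetical memory ceiling
        raise MemoryError("TSV_TNEW_PAGE_ALLOC_FAILED")
    return f"ok with packet size {packet_size}"

def find_working_packet_size(start=10000, floor=100):
    size = start
    while size >= floor:
        try:
            extract(size)
            return size                   # first size that succeeds
        except MemoryError:
            size //= 2                    # halve and retry
    raise RuntimeError("no packet size small enough")

print(find_working_packet_size())         # 10000 -> 5000 -> 2500 succeeds
```

In practice you would do the halving by hand in the Extractor reader settings rather than in code, but the search strategy is the same.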


Werner Daehn :de: (BOB member since 2004-12-17)

Hi Werner,

Is the note you mentioned applicable only to BI extractors as sources in BODS, or does it apply to other sources as well?

I am running into the same error when I use a big SAP table as my source.

Thanks


hakuna_matata (BOB member since 2012-04-27)

It applies to Extractors only, not to tables.

In that case you haven’t used an ABAP dataflow to read the large table but have pulled the table directly into a regular dataflow. That is only meant for small amounts of data - although in the coming 4.1 release this limitation is gone.


Werner Daehn :de: (BOB member since 2004-12-17)

Excellent!

Thank you for your reply


hakuna_matata (BOB member since 2012-04-27)

Hey Werner,

When you say it's coming in 4.1… we are on DS4 SP2fp5 at the moment and hitting this error (loading data into HANA).

Are you saying that if we don’t want to use extractors but rather pull directly from a DSO, and we haven’t got Open Hub in our source environment to use, we need to wait for the SP4 release of DS?

Thanks!
Jez.


jezbraker (BOB member since 2009-02-20)

Hi Werner,

We are on DS 4.1, and I have seen that in an ABAP dataflow you can now specify the package size. But we are using Extractors (not ABAP dataflows), so how can we set the package size?

The extractors we are using send packages of about 100k rows, and in delta mode we get a SYSTEM_NO_ROLL error, so we would like to reduce the package size to 50k.

Thank you in advance

Regards,
Antonio


inver5 :es: (BOB member since 2012-05-23)

When you open the Extractor in the dataflow, you will see an array size parameter. But if I am not mistaken, it is still ignored.

The logic of setting the array size is actually done inside the ODP API itself. Hence I wonder whether you have all the SAP Notes regarding ODP installed on the SAP side. Could that be the case?

We had these issues about two years ago but not in the last year or so, anywhere.


Werner Daehn :de: (BOB member since 2004-12-17)

Hi again Werner,

As far as I know, we have the ODP notes installed, as we have been working with 0_FI_AP4 and GL_10 with no problems in initial or delta loads.

But when working with extractors like GL_4 and AR_4 (which are huge in my case), I want to reduce the package size.

I’ve checked the table ROIDOCPRMS and I see the MAX_SIZE parameter, but that applies only to BW environments, and I’m extracting directly from SAP ECC to BODS.

Where can I configure the package size for those extractors, on the DS side or the ECC side?

Thank you


inver5 :es: (BOB member since 2012-05-23)