Extracting Data from SAP

Hi,

we are facing a problem while extracting data.
Our tables have text fields where users were able to enter more than one line. These fields are causing our exports to fail massively, but we cannot really expect the business to go back and fix them, since there are hundreds of thousands of records involved. I believe they would also expect to see these fields as multiline again once they are loaded.

Please help me overcome this and extract the data successfully.

Thank You very much…


naveen3733 (BOB member since 2010-12-06)

So you mean to say you get the error message “Found a Row delimiter… when expecting a column delimiter…”?
Is that correct?
And you create a transport DF, send the flat files to the DI Server, and load them from there. Am I right?


ganeshxp :us: (BOB member since 2008-07-17)

Yes, you’re right, I’m getting the same kind of error message… something like “row delimiter… column delimiter”. We are getting this error while extracting data from SAP.


naveen3733 (BOB member since 2010-12-06)

Haha, well, there is no solution to this in DI. You need to write a SAP function to remove the form feed / line feed / newline characters.

The main reason is that you cannot use replace_substr_ext() in the R/3 dataflows :wink:

Ask your SAP Basis team to write a function for this. It is a simple one.


ganeshxp :us: (BOB member since 2008-07-17)

Can’t we handle this CR/LF issue in Designer while extracting, with some logic?


naveen3733 (BOB member since 2010-12-06)

No, this is definitely not possible.

Why?
–> On your SAP system, the ABAP runs and dumps the data into a flat file.
–> Then this file gets transferred from the SAP server to the Job Server machine.
–> The first place this file gets read is the R/3 dataflow.
–> The replace_substr_ext() function can handle these special ASCII characters, but the problem is that it is not recognized inside an R/3 dataflow.

So the error is already raised at the point where the data enters the DI system. There is a very simple function in SAP to do this; I will share the name of it, but I don’t have it handy right now.
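
To make the failure mode concrete, here is a small, purely hypothetical ABAP sketch (not the extract program DS generates; the report name, file name and values are made up) of how a newline left inside a text field becomes an extra row delimiter in the staging file:

REPORT z_crlf_demo.

* Hypothetical illustration only, not the DS-generated extract:
* a newline still inside the text field becomes an extra row
* delimiter in the staging flat file.
DATA: lv_text TYPE string,
      lv_line TYPE string,
      lv_file TYPE string VALUE '/tmp/extract_demo.dat'.

CONCATENATE 'line one' cl_abap_char_utilities=>newline 'line two'
  INTO lv_text.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
CONCATENATE '0000000001' lv_text INTO lv_line
  SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
* The embedded newline splits this single record across two lines,
* which DI later reports as a row delimiter found where a column
* delimiter was expected.
TRANSFER lv_line TO lv_file.
CLOSE DATASET lv_file.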


ganeshxp :us: (BOB member since 2008-07-17)

I think we can put a query transform between the ABAP dataflow and the target table and use the replace_substr_ext function in that transform.

I’m testing that right now… Let me get back to you with updates.


naveen3733 (BOB member since 2010-12-06)

No, this error occurs inside the R/3 dataflow, not in the main dataflow in which the R/3 DF resides. So you would already have gotten the error before the data even reaches the main DF, right?


ganeshxp :us: (BOB member since 2008-07-17)

Yup, you’re right… Can you tell me what guidance we should give the Basis team to overcome this issue?


naveen3733 (BOB member since 2010-12-06)

We wrote a SAP function, called it in the R/3 DF, and eliminated those characters. We used something like the statements below.

Wrap them in a SAP function and use it in the R/3 DF after importing the function.

NOTE: THESE ARE SAP (ABAP) STATEMENTS. THEY WON’T WORK IN DI.

REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline
    IN <I_O_PARAMETER> WITH ' '.
REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf
    IN <I_O_PARAMETER> WITH ' '.
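
For reference, this is roughly how those statements might be wrapped as a function module; the function name Z_STRIP_CRLF and the parameter names and typing are assumptions for illustration, not the exact function we used:

FUNCTION z_strip_crlf.
*  Hypothetical interface (defined in SE37), assumed for illustration:
*    IMPORTING  I_TEXT  - the raw text column
*    EXPORTING  E_TEXT  - the cleaned value returned to the mapping
  e_text = i_text.

* Replace line breaks with a space so the flat file written by the
* generated ABAP no longer contains stray row delimiters.
  REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline
      IN e_text WITH ' '.
  REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf
      IN e_text WITH ' '.
ENDFUNCTION.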

ganeshxp :us: (BOB member since 2008-07-17)

Just a private notice between you and me:
In the next release, end of Q4 as it seems, another transport method called RFC_Streaming will appear in the SAP datastores.

  • Faster
  • No files written
  • Binary transfer, so no more issues with weird characters like carriage returns and line feeds
  • No FTP or shared_directory setup issues

It cost us a lot of effort to make it as fast as it is; now we are working on stability.


Werner Daehn :de: (BOB member since 2004-12-17)

Cool, that’s cool.

But would there be any handy way to convert our existing R/3 DFs to that
RFC_Streaming? :stuck_out_tongue:

Ha, I am just human. I keep expecting more and more 8)


ganeshxp :us: (BOB member since 2008-07-17)

I need the fix now… I can’t wait until the next release. I heard that we can fix this by installing the Data Services transports on the SAP system. Do you have any information on what kind of Data Services transports need to be installed, and where we can find those transport files?


naveen3733 (BOB member since 2010-12-06)

The transports do not help either. The solution is, as explained in one of the above posts, to write a SAP function that strips away the characters, then import the function into the SAP datastore and use it inside the mapping of an ABAP dataflow.

(Regarding RFC Streaming: it is just a new transport method, so you do not have to change anything in the dataflows; you change the transport method of the datastore and that’s it. Well, almost: the ABAPs need to be regenerated and uploaded for production, assuming it runs in execute_preloaded mode, but that’s one click in Designer.)


Werner Daehn :de: (BOB member since 2004-12-17)

Resurrecting this thread…

We are still on v4.0 and have created the SAP function to strip out the offending characters. It works fine. We are struggling a bit with data typing, though, which is an obstacle to making this a single, generically reusable function.

Initially, we typed the input and return parameters in SAP as STRING; BODS choked on that as an unknown data type. Moving to VARCHAR(x), we went with a generic varchar(4000), assuming we could map anything smaller (say varchar(255)) and substring the return value. But the ABAP generator wants to pass in the type/length of the input source column in the SAP table and throws a type mismatch (255 vs. 4000). Nothing we have found cues the ABAP generation to say “hey, we are passing you a varchar(4000), ignore where it came from”.

Does this mean that we need a different “Remove CRLF” function for each length input column we need to scrub like this? Is there something we are missing to get around this?

Thanks


petersjd (BOB member since 2003-11-21)

Found a workaround, a little kludgy, but effective. Perhaps there is a more compact way to do it, but it’s working now as a generic function that accepts a varchar(4000):
In the ABAP Data Flow:

  1. In an initial query transform, map the input column from the SAP table to a column of a different name (Z_xxxxx) of varchar(4000).
  2. In a second query transform, call the function with substr(Z_xxxxx, 1, 4000) as the input parameter, mapped to a column of varchar(4000).

In the main dataflow, I then substr the incoming column back to its original length in the initial query transform.

I would have thought that either the mapping to varchar(4000) or the substr(x, 1, 4000) would have done it alone, but neither seems to work by itself; when I combine them in the ABAP dataflow, it processes correctly without any errors.


petersjd (BOB member since 2003-11-21)

Thanks. I tried simply changing the type and, as you say, it didn’t work. I’ll try the combination.


craiggsmith (BOB member since 2009-06-01)