A row delimiter was seen for row number <17> while processing column number <49> in file <Z_COPY_BSEG.DAT>.
The row delimiter should be seen after <312> columns.
Please check the file for bad data, or redefine the input schema for the file by editing the file format in the UI.
Original data from SAP (transaction SE16N):
ZE 9.1016-2773 ZE 9.
Between 2773 and ZE there is a "bad" (non-printable) character, displayed as a square. I want to eliminate this (and other) such characters.
The problem actually starts before row 16: inside row 15, at column 264, sits the bad character, and it splits that row into two bad rows…
Original error:
4044 4692 FIL-080105 26.05.2010 15:17:13 |Data flow DF_COPY_BSEG_STAGING|Reader Z_COPY_BSEG
4044 4692 FIL-080105 26.05.2010 15:17:13 A row delimiter was seen for row number <15> while processing column number <264> in file
4044 4692 FIL-080105 26.05.2010 15:17:13 <E:/exchange/PES/DAT/Z_COPY_BSEG.DAT>. The row delimiter should be seen after <312> columns. Please check the file for bad
4044 4692 FIL-080105 26.05.2010 15:17:13 data, or redefine the input schema for the file by editing the file format in the UI.
4044 4692 FIL-080105 26.05.2010 15:17:13 |Data flow DF_COPY_BSEG_STAGING|Reader Z_COPY_BSEG
4044 4692 FIL-080105 26.05.2010 15:17:13 A row delimiter was seen for row number <16> while processing column number <49> in file <E:/exchange/PES/DAT/Z_COPY_BSEG.DAT>.
4044 4692 FIL-080105 26.05.2010 15:17:13 The row delimiter should be seen after <312> columns. Please check the file for bad data, or redefine the input schema for the
4044 4692 FIL-080105 26.05.2010 15:17:13 file by editing the file format in the UI.
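The failure mode in this log (a row of 264 columns followed by a fragment of 49) can be detected up front by counting columns per physical line. A minimal Python sketch, assuming a tab-delimited file and the expected column count of 312 taken from the error message; the Latin-1 encoding is an assumption:

```python
# Sketch: flag rows whose column count deviates from the expected 312.
# The delimiter, expected count and encoding are assumptions taken from
# the error message above; adjust them to your actual file format.
EXPECTED_COLS = 312
DELIM = "\t"

def find_bad_rows(path, expected=EXPECTED_COLS, delim=DELIM):
    """Return a list of (line_number, column_count) for malformed rows."""
    bad = []
    with open(path, encoding="latin-1", newline="") as f:
        for lineno, line in enumerate(f, start=1):
            cols = line.rstrip("\r\n").split(delim)
            if len(cols) != expected:
                bad.append((lineno, len(cols)))
    return bad
```

Running this against the .DAT file before the dataflow starts lets you quarantine the broken rows instead of discovering them mid-load.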
Such data shouldn't exist in SAP in the first place, so one option would be to go to the corresponding SAP screen and correct that text.
The other option is to write a function module which removes non-printable characters (input is a string, output is the corrected string), then import that function into the SAP datastore and use it inside the query mapping of the R/3 dataflow.
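The cleaning logic such a function module would implement can be sketched outside ABAP. This is an illustrative Python version, not the actual FM; the exact set of characters to strip (here all C0 control characters plus DEL) is an assumption you would tailor to your own delimiters:

```python
import re

# Sketch of the cleaning logic such a function module would implement:
# drop control characters (CR, LF, tab, form feed, NUL, ...) from a field
# value so no stray row or column delimiter survives into the extract file.
# The character class is an assumption; printable '#' etc. are left alone.
CONTROL_CHARS = re.compile(r"[\x00-\x1f\x7f]")

def strip_nonprintable(value: str) -> str:
    return CONTROL_CHARS.sub("", value)
```

Note that this only strips genuine control characters; a literal `#` inside a valid address survives untouched.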
Don't ask why there is no Data Services function to do this inside the R/3 dataflow already. I have been waiting for one myself for ages.
I tried logging a case on this issue and their response was to create an enhancement request. So please check out the idea 'A row delimiter (or a column delimiter) was seen for row number…' on Idea Place on the SAP Community Network.
Please vote on it if you are having the same issue. More votes may get their attention and get a function created for this problem.
I run almost all of my delimited files through a script that converts them to fixed length. If there's a data problem, the script will either fix it or error accordingly.
A DS CSV reader may not error at all. It may simply decide to stop reading and lose half your file.
Make sure you reconcile the input and output row counts of your DS CSV readers, especially if you use field framing characters.
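The delimited-to-fixed-length conversion described above can be sketched like this in Python; the field widths and the tab delimiter are purely illustrative assumptions, not the poster's actual script:

```python
# Sketch: convert one tab-delimited row to fixed-length fields, failing
# loudly on rows with the wrong column count instead of silently
# mis-parsing them. Widths and delimiter are illustrative assumptions.
WIDTHS = [10, 4, 35]  # hypothetical column widths

def to_fixed_length(line: str, widths=WIDTHS, delim="\t") -> str:
    cols = line.rstrip("\r\n").split(delim)
    if len(cols) != len(widths):
        raise ValueError(f"expected {len(widths)} columns, got {len(cols)}")
    out = []
    for value, width in zip(cols, widths):
        if len(value) > width:
            raise ValueError(f"value {value!r} exceeds width {width}")
        out.append(value.ljust(width))
    return "".join(out)
```

The point of the explicit `ValueError`s is exactly the behaviour described: a bad row stops the script instead of being half-read and dropped.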
"Currently we use the replace_substr() and replace_substr_ext() functions for the columns of sources from normal databases, not SAP sources. We load data from a front-end application's tables, where I encounter form feed, line feed and tab characters. So it would be awesome if we had one for SAP! Thumbs up for Werner's idea."
From a mapping: you simply select the column within the query of the ABAP dataflow and click on Functions. There the category of your SAP datastore will show up, with all the functions you have imported.
I've inserted the function call within the ABAP workflow (see picture).
Afterwards I imported the automatically generated ABAP code into the target system…
If I now execute the job in BO DS, I get an ABAP dump (see attached dump).
Can somebody please help me fix this issue?
Or does somebody know why this dump comes up?
—Edit—
I think it's a conversion problem within the function module which I'm calling…
Background:
I have table KNA1 from SAP, and the function module must run a check on NAME1 varchar(35), CITY1 varchar(50) and STREET varchar(200).
The import parameter of my FM is a char(2000) input parameter…
If I, for example, hand the NAME1 field over to the FM import parameter in an ABAP program, I get the short dump because of type incompatibility.
OK, next try…
Within the ABAP workflow I added a query where I first converted all three fields to varchar(2000), and in a second query I handed them over to the FM.
If I look at the generated ABAP, it uses direct mapping… which means it once again hands over the NAME1 field directly.
So no success here…
Next try…
I changed the import parameter in the ABAP FM to type "string" (then the incoming length doesn't matter).
But when I import the FM again, the import parameter is ignored?
Isn't BO DS able to handle string types from SAP???
If not, it should at least create a default input field of varchar(4000) or something…
But ignoring it is the worst solution! dump.txt (120.0 KB)
How can you use a custom function if the error is already in the *.DAT file?
If you're extracting a file from SAP, I think you must remove the unwanted characters from the text fields during the ABAP workflow, before the data ever reaches BO DS.
Otherwise the system cannot "tell" whether a given row or column delimiter is valid or not.
But it's interesting…
How did you solve this issue with a custom function?
@BODI User:
I don’t know how familiar you are with ABAP FMs.
The trick is that the import parameter of your FM is of type "ANY".
BO DS will import that as a varchar field.
Then you can hand any value over to this ABAP FM without getting a dump and remove the unwanted characters.
Not with a custom BODS function, but with a custom SAP function.
And I never bothered to get something written and installed. We have over 20 SAP systems here…
What we normally do is place the most offending fields at the end of the .dat file, so that when a stray row delimiter is encountered it does not impact the remaining columns of the row.
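If you cannot clean the data at the source, split rows can sometimes be repaired after the fact by merging consecutive physical lines until the expected column count is reached. A hedged Python sketch, assuming tab-delimited data with 312 columns and that the bad character only ever injects line breaks, never extra delimiters:

```python
# Sketch: repair a file in which a stray line break split rows, by merging
# consecutive physical lines until the expected delimiter count is reached.
# Assumption: the bad character only injects line breaks; if it could also
# inject delimiters, this heuristic would over-merge and is not safe.
def rejoin_rows(lines, expected=312, delim="\t"):
    fixed, buffer = [], ""
    for line in lines:
        buffer += line.rstrip("\r\n")
        if buffer.count(delim) >= expected - 1:
            fixed.append(buffer)
            buffer = ""
    if buffer:
        fixed.append(buffer)  # leftover fragment: surface it for inspection
    return fixed
```

This mirrors the error log above: the 264-column fragment and the 49-column fragment of row 15 join back into one 312-column row.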
@barthodo, I am facing a row delimiter issue on an address field because it contains '#', e.g. P.O.BOX # 1234. The client says it's valid and they want it loaded. In this scenario, can I still use the FM? The length of the field is varchar(60).
Thanks