Loading OHD into BO table crashes ("value too large for column")

Hi, here is my challenge of the week. Looking forward to your ideas.

I am loading a DSO via OHD into a BO table. There is a varchar(25) column matching a CHAR(25) InfoObject in the DSO. For that field, after 400k records have been read successfully, I get an ORA-12899: value too large for column (actual: 26, maximum: 25).

Is this a Unicode / encoding issue? What kind of automatic conversion takes place in the OHD handling in BO? Is there anything on this subject in the reference manuals that I overlooked?

Or is it possible that this error message pops up in case of database configuration issues? (Tablespace overflow? Something related to extra string termination bytes?) It strikes me as unlikely, but I am not an Oracle expert.

Any ideas welcome.

thanks,
matthias


deskremark (BOB member since 2012-11-01)

Very likely your target table is defined as varchar2(25 BYTE), so it is constrained to whatever fits into 25 bytes. With UTF-8 some characters require 2 bytes (or more), and then this would fail.
SAP and Data Services use character semantics, so you could change your table to varchar2(25 CHAR).
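
To illustrate the byte-versus-character point with a minimal Python sketch (the sample value is made up): a value that fits CHAR(25) by character count can still need more than 25 bytes once encoded as UTF-8, which is exactly what a varchar2(25 BYTE) column checks.

```python
# A 25-character value with one non-ASCII character (hypothetical sample data).
value = 24 * "x" + "ö"

print(len(value))                        # 25 characters -> fits the CHAR(25) InfoObject
print(len(value.encode("utf-8")))        # 26 bytes      -> ORA-12899 against varchar2(25 BYTE)
print(len(value.encode("utf-8")) <= 25)  # False: the byte-length check is what fails
```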


Werner Daehn :de: (BOB member since 2004-12-17)

Do you mean this?:

Datastore directory -> find table -> context menu entry “Open” -> find column -> context menu entry “Properties” -> Data type pull down menu?

There is an entry “varchar”, which was selected, but no entry “varchar2”.

I will try to just double the number of characters and see what happens. I wonder whether the characters will get garbled because of broken / missing Unicode conversions.


deskremark (BOB member since 2012-11-01)

Update: I realized the error happens in the OHD, not in the target table. I changed the column in the OHD schema as well; there is no menu entry “varchar2” there either.

When running the job again, the error persists. I had to delete the query and create a new one because the change in the OHD schema would not propagate to the query. But even with the changed query, the error is still the same. Now all objects in the BO dataflow (OHD, query, target table) agree that the column triggering the error is 50 bytes long, yet the job still thinks it is 25 bytes long.

Perhaps it is not allowed to change column types of imported OHDs? It certainly doesn’t seem like a good idea…

Some more context from the job log: the first entry is “utf8 coerced to utf16”. Later on, the transcoders for the BW datastore and for the BO_STAGING datastore are initialized "to transcode between engine codepage utf16 and datastore codepage ".
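
To convince myself that the transcoding itself is not losing or adding characters, a small Python sketch (my own illustration, not taken from the job log): the codepage conversion only changes the byte representation, the character count stays the same.

```python
# Sketch (not from the job log): transcoding between UTF-8 and UTF-16 changes the
# byte representation only; the number of characters in the value stays the same.
value = 24 * "x" + "ö"                     # hypothetical 25-character value

utf8 = value.encode("utf-8")               # 26 bytes
utf16 = value.encode("utf-16-le")          # 50 bytes (2 bytes per character here)

print(len(value), len(utf8), len(utf16))   # 25 26 50
print(utf16.decode("utf-16-le") == value)  # True: the round trip is lossless
```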

Should I open another OSS message on this?

Not sure if I asked this earlier in this thread: is there any comprehensive documentation on Unicode and SAP BW and SAP BO?

thanks!
cheers,
matthias


deskremark (BOB member since 2012-11-01)

So you are using template tables? I was assuming somebody else had created the table in Oracle, sorry.

In the template table options there is a flag “use nvarchar”. This will create nvarchar2() datatypes in Oracle, which is a Unicode datatype with character-length semantics.
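
If you want to double-check what the template table actually created, here is a small cx_Oracle sketch (connection details and the table name are placeholders, not from this thread) that reads the data dictionary and shows whether the columns use byte or character semantics.

```python
# Sketch for verifying the column semantics of the generated table in Oracle.
# User, password, DSN, and table name are placeholders; adjust for your system.
import cx_Oracle

conn = cx_Oracle.connect("bo_staging", "secret", "dbhost/orcl")
cur = conn.cursor()
cur.execute(
    """SELECT column_name, data_type, data_length, char_length, char_used
         FROM user_tab_columns
        WHERE table_name = :tab""",
    {"tab": "MY_TEMPLATE_TABLE"},  # the data dictionary stores table names in upper case
)
for name, dtype, byte_len, char_len, char_used in cur:
    # NVARCHAR2 columns report char_used = 'C' (character semantics);
    # a plain VARCHAR2(25 BYTE) column would show char_used = 'B'.
    print(name, dtype, byte_len, char_len, char_used)
conn.close()
```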


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks, you nailed it again! (-: Problem solved.

Just completed a data load; the contents of the target table look good at a preliminary glance.

(Sorry, I should have mentioned the table origin. I was using a template table that I “imported”. The successful test just now was with a fresh template table.)


deskremark (BOB member since 2012-11-01)