BODI Clob and Blob

Hi all,
Does BODI support the Clob and Blob data types, and how do they work with Oracle?

Thanks in advance!

Sidi.


hammamr (BOB member since 2007-11-05)

We support CLOB data only. Binary will be supported in the next release.


Werner Daehn :de: (BOB member since 2004-12-17)

Correct me if I am wrong: CLOB data of size up to 4KB only is supported, through the “Unsupported datatype” feature. Right?


adiga (BOB member since 2006-11-22)

No, bigger than that.


Werner Daehn :de: (BOB member since 2004-12-17)

We have found that a MS SQL CLOB type (datatype TEXT) only works up to 8K. Larger than that, and you get truncation warnings from the database target, even though the source, target, and all intermediate steps are defined as a LONG in DI.

I have the vague impression that you have to handle sizes larger than this in a “special” way at the ODBC driver level.


dnewton :us: (BOB member since 2004-01-30)

Sorry if it's a dumb question. The manual says:
“Data Integrator converts a clob data type input to varchar if you select the Import unsupported data types as VARCHAR of size option when you create a database datastore connection in the Datastore Editor.”

And the maximum size I can enter here is 4000.

Am I missing something here?


adiga (BOB member since 2006-11-22)

For varchar that’s true. The datatype inside DI should be LONG for CLOBs!


Werner Daehn :de: (BOB member since 2004-12-17)

<We have found that a MS SQL CLOB type (datatype TEXT) only works up to 8K. Larger than that, and you get truncation warnings from the database target, even though the source, target, and all intermediate steps are defined as a LONG in DI.

I have the vague impression that you have to handle sizes larger than this in a “special” way at the ODBC driver level.>

We are facing the same issue right now. Could you please explain what this “special” way is that we have to follow?
I used ODBC as the target and am getting a “Right Truncation” error.

Thanks in Advance


jar80 (BOB member since 2005-10-06)

It is “special” meaning DI itself, when inserting into the database, needs to use different API calls with the SQL Server database. There’s nothing you or I, as users of DI, can do. I’ve never found a way around it.

Oddly enough, we just ran into this problem today, after having been quiet on it for months. This must be the Long Data Problems Day.


dnewton :us: (BOB member since 2004-01-30)

This thread seems to imply that you can’t actually load data into a long database field without truncation. Can this be true, or am I missing something?

I’m loading a long field in DI into an Oracle LONG field and the data is truncated to 4000 chars. The following warning is displayed for each record:

29802 3992845232 RUN-051009 19/02/2008 16:41:08 Warning: Data truncation occurred while loading a LONG column for table target .

All the fields are defined as LONG and I’m using the varchar_to_long() function to populate the field. If I send the output to a file, it’s all there; if I send it to Oracle, it gets truncated to 4000 chars.

Is there any way round this to get the full data in Oracle?


jpcoley :uk: (BOB member since 2008-02-19)

Caveat: I have not tried this in DI, I used this solution long ago with a different product that had similar truncation issues.

It is really ugly, but you could try the scheme we used. Have a temporary (staging) table that all your data goes into. For each CLOB you have n fields, where n is the maximum number of 4K blocks the field can contain. Split up the data in your DF into these 4K blocks and write them out to the n fields. Then execute a sql() call in your final script that inserts into the final table, concatenating the n staging fields back together into a single CLOB.
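The chunking step above can be sketched as follows, in Python for illustration only (the 4000-character block size, the block count, and the table/column names in the comment are assumptions, not anything DI-specific):

```python
def split_into_blocks(text, block_size=4000, n_blocks=8):
    """Split a CLOB value into at most n_blocks fixed-size pieces,
    padding the result with empty strings so every staging column
    gets a value."""
    if len(text) > block_size * n_blocks:
        raise ValueError("value does not fit into the staging columns")
    blocks = [text[i:i + block_size] for i in range(0, len(text), block_size)]
    return blocks + [""] * (n_blocks - len(blocks))

# The final script would then reassemble the pieces, e.g. with Oracle
# syntax (table and column names here are hypothetical):
#   INSERT INTO final_table (id, big_text)
#   SELECT id, part1 || part2 || part3 || part4 ||
#              part5 || part6 || part7 || part8
#   FROM staging_table;
```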

Did I mention this is really ugly, performs badly, and should only be used in a dire emergency? =) We used it for a one-time backload. It got the job done, but took a lot of babysitting and manual checking.

  • Ernie

eepjr24 :us: (BOB member since 2005-09-16)

Hi jpcoley,
Late reply, but I think this works: there is a setting in DSConfig which limits the LONG data size on the Job Server. Changing it solved our problem.
Raj


jar80 (BOB member since 2005-10-06)

Really? Interesting. Do you know which setting it is?


dnewton :us: (BOB member since 2004-01-30)

Do you mean this entry “MaxLongColumnSize=10000”?

Do you know the maximum size of the long column supported in DI?
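For reference, the entry in DSConfig.txt would look something like the fragment below; the section name is an assumption on my part, and only the MaxLongColumnSize line comes from this thread:

```ini
; DSConfig.txt (sketch; section name assumed)
[AL_Engine]
MaxLongColumnSize=10000
```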

Thanks !


ohtiek :hong_kong: (BOB member since 2006-09-03)

We are reading from an image column in SQL Server 2008 and writing to an image column in a different SQL Server 2008 database. DS 12.2 imports the image columns as blob, and DS reads them fine. But when we try to write using bulk load, we get the SQL Server warning ‘Conversions not allowed using bcp_moretext’ and the data is not loaded. Is there a workaround for this?


dlaplant :us: (BOB member since 2006-03-27)