BusinessObjects Board

BODS - source: PostgreSQL, target: Oracle - problem with text fields

Hi all, I am using PostgreSQL as the source and Oracle as the target.
I was able to connect BODS to the source via the JDBC adapter and extract data, except for the text fields.

When I import tables from PostgreSQL, the text fields are marked as Varchar(2147483647). In the Query transform I tried substr, a long2varchar conversion, and decreasing the precision to 2000 - no success. The error message I get is:

|Session JB_CITNETREP_LOAD_TEST1|Data flow DF_XT_CTN_CHANGEITEM
Data flow <DF_XT_CTN_CHANGEITEM> received a bad system message. Message text from the child process is
<x£￘゚1==========================================================
Collect the following and send to Customer Support:
1. Log files (error_*, monitor_*, trace_*) associated with this failed job.
2. Exported ATL file of this failed job.
3. DDL statements of tables referenced in this failed job.
4. Data to populate the tables referenced in the failed job. If not possible, get the last few rows (or sample of them) when
the job failed.
5. Core dump, if any, generated from this failed job.

Do you have any suggestions on how to handle text fields?

Thank you in advance,
Valentin

What size are the Postgres text fields? Varchar(max) in SQL Server is a large storage type - it may be that you need a CLOB or BLOB field in Oracle to support your transformation if the Postgres source data is large.
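If CLOB does turn out to be usable on your side, a landing table would only need something like this (table and column names here are made up):

  -- Hypothetical Oracle landing table; CLOB holds text beyond the
  -- usual VARCHAR2 size limit.
  CREATE TABLE stg_changeitem (
    item_id      NUMBER,
    long_comment CLOB
  );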

Generally I’ll try to find the longest string in each column, as well as considering any business rules that are in place. If it’s an ETL process, land the data in Oracle first and then transform it, rather than trying to do the E and T of the ETL in one step.
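To find the longest string, a query like this against the PostgreSQL source is enough (change_item / long_comment are just placeholder names):

  -- Shows how long the text values actually get, so you can size the target column.
  SELECT max(length(long_comment)) AS max_len
  FROM change_item;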

The max size I have is 5000 characters in the DEV environment.
I tried landing the data in Oracle (there is no support for CLOB in BODS 4.2 SP 6.6) and also writing it directly to a CSV - same error.

I guess I am stuck on reading the big text fields (I was able to read/write a relatively small text field).
Could it be related to the JDBC driver?

It’s been a while since I’ve used Oracle - is Long supported as a data type for you?

Also, if you’re importing into a data warehouse, I’d question the worth of a 5000-character string in a warehouse.

Create a view in Postgres to read/format/transform your problematic data and extract from that view.
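Something along these lines (placeholder names again) would make the JDBC adapter import a bounded varchar instead of Varchar(2147483647):

  -- The explicit cast makes the view column character varying(5000),
  -- so the imported metadata gets a sensible precision.
  CREATE OR REPLACE VIEW v_change_item AS
  SELECT item_id,
         substr(long_comment, 1, 5000)::varchar(5000) AS long_comment
  FROM change_item;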

I had a similar problem with Microsoft SQL Server, where Varchar(max) columns were translated to Long when the tables were mapped, and that Long datatype was then causing all sorts of problems…

I solved the issue by manually editing, in the datastore, the metadata of the source tables that had Long values, changing them to Varchar(5000), and then treating them that way all the way to the final output (the mapping of the input tables, any intermediate datatypes, and the output tables).
I am sure that those fields will never contain anything longer than that.

If you can get confirmation from the owner of the DB on the maximum length of your fields, I’d try changing them to Varchar(YourValue).
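A quick check like this on the source (again with placeholder names) should come back with 0 if the assumed maximum really holds:

  -- Counts rows that would not fit a Varchar(5000) mapping.
  SELECT count(*) AS too_long
  FROM change_item
  WHERE length(long_comment) > 5000;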