Long data type on table_comparison

Is there a way to compare long data columns on table_comparison?


rajeshvr06 (BOB member since 2012-10-05)

I don’t think so. CLOB or BLOB data?


Werner Daehn :de: (BOB member since 2004-12-17)

It is defined as a CLOB data type in the Oracle database, and BODS converted it to a LONG data type.


rajeshvr06 (BOB member since 2012-10-05)

So it is character data. Would it be possible to change the target to varchar(50000) or so? DS can convert CLOBs to varchars.


Werner Daehn :de: (BOB member since 2004-12-17)

Or you could hold a hash of that field as an additional column and compare hashes if you just want to find out if it has changed.
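That hash-and-compare idea can be sketched in plain Python (illustrative only — in DS the digest would come from a database function or a custom function and be stored as an extra varchar column; the names here are made up):

```python
import hashlib

def remark_hash(text: str) -> str:
    # Compact, fixed-length digest of an arbitrarily long text value.
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

# Stored alongside the row instead of the full CLOB comparison:
stored = remark_hash("some very long remark ...")

# On the next load, compare digests, not the large column itself.
incoming = "some very long remark ..."
print(remark_hash(incoming) == stored)  # True: value unchanged
```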


HerdplattenToni (BOB member since 2011-05-13)

Werner:

Do you want me to change the data_type on Oracle? Can you please give me a clear suggestion?


rajeshvr06 (BOB member since 2012-10-05)

Yes, that is what I had in mind. You read the LONG/CLOB from the source, convert it with the long_to_varchar() function in a first Query transform, and then load the resulting varchar into the Oracle target - where the column datatype has to be varchar2 as well.
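For example, assuming the column is called COMMENTS and 50000 characters is enough (both assumptions for illustration), the mapping in that first Query transform would be:

```
long_to_varchar(COMMENTS, 50000)
```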


Werner Daehn :de: (BOB member since 2004-12-17)

How do you define a varchar2 column in Oracle as more than 4000 characters? The limit is 4000, correct?


rajeshvr06 (BOB member since 2012-10-05)

Is there a way to modify the SQL statement prepared by the Table_Comparison transform? If it is possible, then I will change the SQL statement to use LIKE in the CLOB column condition instead of the = sign. Like below:

Update emp
set empname='xxxx'
where empname like 'yyyy'; -- Here empname is declared as CLOB type in Oracle, so we can't specify a condition with an equal sign.


rajeshvr06 (BOB member since 2012-10-05)

Instead of varchar, declare the datatype as CLOB in Oracle; it comes into DI as LONG. Then convert the LONG to varchar using the long_to_varchar() function. The field size can be of any length, even 100K.

Arun


Arun.K :us: (BOB member since 2011-10-18)

Arun:
My requirement is something different. I will explain clearly where I'm stuck.

==========================================
Requirement:
Loading file data into an Oracle table. If the same row already exists, the existing values of that record must be updated.

==========================================
File structure:
User_id Int
User_details Varchar(7500)

Table structure in Oracle:
User_id Number(10)
User_details CLOB

==========================================
My design is:
File(Source) --> Query_Transform --> Table_Comparison(with table User_remarks) --> User_remarks(Target)

==========================================
Table_Comparison:
[selected] Input contains duplicate keys
Comparison method: Row-by-row select
Primary key columns: User_id
Compare columns: User_details

==========================================
Query_Transform in the design shows below structure:
Schema In:
User_Id int
User_Details varchar(7500)

Schema Out:
User_Id decimal(10,0)
User_Details long

==========================================
Table_Comparison in the design shows below structure:
Schema In:
User_Id decimal(10,0)
User_Details long

Schema Out:
User_Id decimal(10,0)
User_Details long

==========================================
While executing the job, I'm getting the following error:
Error:
LONG or BLOB column <User_details> cannot be used as a compare column in a Table_Comparison transform.


rajeshvr06 (BOB member since 2012-10-05)

OK. So the issue is that Table Comparison is comparing a LONG in the source with a CLOB in the target and throwing an error. Have you tried converting to varchar in the Query transform before the TC?

Arun


Arun.K :us: (BOB member since 2011-10-18)

Hello,

I am facing a similar issue:

  1. Oracle database has columns defined as CLOB datatype.
  2. DI has datastore tables defined with LONG datatype columns.
  3. Table Comparison throws the following error:
The transform <Table_Comparison> cannot use a LONG or BLOB data type as a compare column. (BODI-1116083)

@Rajesh, did you ever find a solution or workaround?

@Arun, I tried using the long_to_varchar() DS function prior to the TC, but I am getting the following conversion error:

[Table Comparison:Table_Comparison] 
Cannot convert the type from <varchar(50000)> (source column:COMMENTS) to <long> (target column:COMMENTS). (BODI-1111035)

@HerdplattenToni, how to generate the hash value? Using DBMS_CRYPTO or DBMS_OBFUSCATION_TOOLKIT?

@Anyone else, any other suggestions please?!


Darth Services :uk: (BOB member since 2007-11-20)

In the end I used the Oracle SYS.DBMS_CRYPTO.HASH function.

I used this in DI by creating a SQL transform containing the SQL that selects from the previous staging table, e.g.
SELECT
dbms_crypto.hash(CLOB_COLUMN,3) as CLOB_COLUMN_HASH,
CLOB_COLUMN,
ANOTHER_COLUMN,
YET_ANOTHER_COLUMN
FROM STG_DM_WHATEVER_1;

This way the hash generation is pushed down to the database and I do not have to worry about creating any functions in the datastore.
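As a side note, the second argument 3 is DBMS_CRYPTO.HASH_SH1, i.e. SHA-1. Assuming the CLOB holds UTF-8 text (character-set conversion inside the database can make the bytes differ), the digest can be sketched outside the database like this — Oracle returns a RAW, which displays as uppercase hex:

```python
import hashlib

def sha1_raw_hex(text: str) -> str:
    # SHA-1 digest as uppercase hex, the way Oracle displays a RAW value.
    return hashlib.sha1(text.encode("utf-8")).hexdigest().upper()

print(sha1_raw_hex("example clob contents"))
```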

In a later dataflow containing the Table Comparison of the final staging table against the dimension table, the TC compares the hash of the long column (which is a varchar/char) plus all the other non-long columns; it does not compare the long column itself. Using the column examples from above, the TC would compare on:

CLOB_COLUMN_HASH
ANOTHER_COLUMN
YET_ANOTHER_COLUMN


Darth Services :uk: (BOB member since 2007-11-20)