Map the DEFAULT keyword to a target table with a default constraint

Hi

We have an issue with BODS and SQL Server. We have tables with default constraints that are managed by the server, and for which BODS is not allowed to supply a value.

Take a look at this sample table:

--drop table testDefaultValue
create table testDefaultValue
(
            Col1 int not null,
            Col2 varchar(30) not null,
            Col3 varchar(30) not null constraint df_testDefaultValue_col3 default 'Default Value',
            [Action] varchar(100) not null
)

So far the solution has been not to map these columns at all when adding the table to the datastore. BODS then generates SQL that does not supply any values for these columns, and SQL Server applies the default value.

/*1 - insert a row without the column that has the default constraint*/
insert into testDefaultValue (Col1, Col2, Action)
values (1, 'something', 'insert a row without the column that has the default constraint');

select *
from testDefaultValue as tdv;

However, we want to be able to do an initial load, and there we would like to provide the values for these columns ourselves, but only in the full-load data flows. Therefore we must add these columns to the datastore definition.

What I need to know is how I can make BODS use the DEFAULT keyword in SQL as the value for a column:

/*2 - insert a row providing a value for all 3 columns*/
insert into testDefaultValue (Col1, Col2, Col3, Action)
values (2, 'something', 'manually added value', 'insert a row providing a value for all 3 columns');

select *
from testDefaultValue as tdv;


/*3 - insert a row using the default keyword*/
insert into testDefaultValue (Col1, Col2, Col3, Action)
values (3,'something', default, 'insert a row using the default keyword');

select *
from testDefaultValue as tdv;

How can I make BODS act like the third insert query?


asrafi (BOB member since 2013-01-23)

Hi - I’m not aware of a way to make DS work like the 3rd query. SQL Server identities and defaults are a hassle to work with in Data Services. Typically I will bring the table in as-is and have separate branches in my DF for insert and update. For example, if there is an identity column, on the insert branch I will not pass that column to the target table.


brudosm (BOB member since 2014-12-12)

I can write a query against the Oracle system tables to grab the default value for a column. You should be able to do the same thing against SQL Server.

I would write a custom function in Data Services that queries the system tables. The inputs would be the table and column names. Do not use this function within the mappings of the columns that use a default; that would perform poorly because it would execute once per row. Instead, create parameters on the Dataflow and put the function call in the call to the Dataflow, one parameter for each column that uses a default.
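A minimal sketch of what the body of such a custom function could look like, assuming an Oracle datastore named ORA_SRC, a function named CF_Get_Column_Default, and input parameters $p_table and $p_column (all of these names are illustrative, not built-in objects):

# Body of CF_Get_Column_Default($p_table, $p_column).
# Reads the default expression for one column from the Oracle data dictionary.
# {$variable} substitutes the variable's value, quoted, into the SQL text.
$l_default = sql('ORA_SRC',
    'SELECT DATA_DEFAULT FROM USER_TAB_COLUMNS ' ||
    'WHERE TABLE_NAME = upper({$p_table}) ' ||
    'AND COLUMN_NAME = upper({$p_column})');
Return $l_default;

In the call to the Dataflow you would then put the function call itself, e.g. CF_Get_Column_Default('MY_TABLE', 'MY_COLUMN'), as the value of one Dataflow parameter per defaulted column, so the dictionary query runs once per Dataflow execution instead of once per row.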


eganjp :us: (BOB member since 2007-09-12)

eganjp,

Could you please show sample code for Oracle which reads the default value for a field?
And how do I call this code from DS?


White Owl (BOB member since 2007-05-24)

select DATA_DEFAULT from user_tab_columns where table_name = 'MY_TABLE' and column_name = 'MY_COLUMN';

The value returned will be something like: ‘N’
The value includes the single quotes if it is a string, no quotes if it is a number.


eganjp :us: (BOB member since 2007-09-12)

Ok.

But how do I call it from BODS?


White Owl (BOB member since 2007-05-24)

There are a number of ways to do that.

  1. Bring the system table (USER_TAB_COLUMNS) into your Datastore and use it in a Query transform
  2. Bring the system table into your Datastore and use it in a lookup()
  3. Code a sql() function call to get the value (see the sketch below)
  4. Write a custom function to encapsulate any logic necessary
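
For option 3, a minimal sketch of a script placed before the Dataflow, using the same data dictionary query (the datastore name ORA_SRC and the global variable $G_Col3_Default are assumptions for the example):

# Read the column default once, before the Dataflow runs, and keep it in a global variable.
$G_Col3_Default = sql('ORA_SRC',
    'SELECT DATA_DEFAULT FROM USER_TAB_COLUMNS ' ||
    'WHERE TABLE_NAME = \'MY_TABLE\' AND COLUMN_NAME = \'MY_COLUMN\'');

The variable can then be referenced inside the Dataflow (for example as a parameter value), so the lookup against the dictionary happens only once.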

eganjp :us: (BOB member since 2007-09-12)

Sorry, I do not understand how to do all that.

In plain SQL, to have a column filled with its default value, all I need to do is omit that column from the insert.

create table t (a char(1), b date default sysdate);
insert into t (a) values ('a');

And t.b would automatically get today's date.
How do I do that task in BODS?


White Owl (BOB member since 2007-05-24)

What objects are you working with in Data Services to populate the target table?


eganjp :us: (BOB member since 2007-09-12)

Source is a table object (a real Oracle table acting as a stage).
Target is another table object, also a real Oracle table, but with default values defined for some of its columns, and during the ETL these columns are supposed to receive those default values. The target table also has an auto-increment field which acts as a surrogate primary key.
So the DataFlow looks like this:

Table(S) → Query → Table_Comparison → Map Operation → Key_Generation → Table(T).

It works if I put the default values in a Query for those fields. I do not want that; I want the Query (or some other DS object) to simply use the defaults that are already set up on the database side.
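
For reference, the workaround mentioned above ("put the default values in a Query") presumably means repeating the database default in the Query transform's column mapping, e.g. mapping the date column from the earlier example as:

sysdate()

which duplicates the DEFAULT sysdate already defined on the table, and that duplication is exactly the objection.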

It would also be nice to get rid of the Key_Generation transform, since it duplicates the auto-increment that already exists on the database side.


White Owl (BOB member since 2007-05-24)