I am using the Date_Generation transform to populate a new date dimension. The date functions available in DS are not sufficient to calculate all the attributes of my dimension, so I am trying to use the sql() function for the remaining attributes. The problem is that I do not know how to pass the transform-generated date value DI_GENERATED_DATE to the sql() function.
As an example, I want to derive the weekday name for the generated date:
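Roughly, the mapping I tried looks like this (the datastore name here is just a placeholder):

```
sql('SQLServer_DS', 'SELECT DATENAME(weekday, Date_Generation.DI_GENERATED_DATE)')
```

which is what triggers the binding error below, since the database has no idea what Date_Generation.DI_GENERATED_DATE is.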
My input schema is Date_Generation and simply passing the parameters as defined above generates an error (the multi-part identifier Date_Generation.DI_GENERATED_DATE could not be bound).
How can I access the value of DI_GENERATED_DATE within the sql() function?
Hi,
The sql() function executes statements directly against the database; I don't think you can call BODS functions or reference BODS columns from inside the statement text.
What are the attributes you need for your date dimension?
There are several date attributes I'd like to add to my date dimension that are not available via the standard date functions within DS, for example Weekday Name, Month Name, etc. All the other attributes I need are available via the SQL Server functions DATEPART, DATENAME, and DATEADD.
I'd like access to the generated date from the transform; otherwise I will seed my date dimension via a stored procedure or CTE instead.
If you want to use the Date_Generation transform and get the weekday name, this is possible with the day_in_week() and decode() functions, setting the names based on the day numbers (optionally using a lookup against a parameter table).
The same approach works for the month name; the to_char() function can also return the month name directly.
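As a sketch, assuming day_in_week() returns 1 for Monday through 7 for Sunday, the weekday-name mapping could look like this:

```
decode(day_in_week(DI_GENERATED_DATE) = 1, 'Monday',
       day_in_week(DI_GENERATED_DATE) = 2, 'Tuesday',
       day_in_week(DI_GENERATED_DATE) = 3, 'Wednesday',
       day_in_week(DI_GENERATED_DATE) = 4, 'Thursday',
       day_in_week(DI_GENERATED_DATE) = 5, 'Friday',
       day_in_week(DI_GENERATED_DATE) = 6, 'Saturday',
       'Sunday')
```

and the month name as to_char(DI_GENERATED_DATE, 'MONTH').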
Another way is to use a SQL transform (not the sql() function) as a source and code your "date generation" with SQL Server functions. This implies writing row-generation syntax in T-SQL.
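A minimal sketch of such a row-generating query in T-SQL, using a recursive CTE (the date range is illustrative):

```sql
WITH dates AS (
    SELECT CAST('2020-01-01' AS date) AS calendar_date
    UNION ALL
    SELECT DATEADD(day, 1, calendar_date)
    FROM dates
    WHERE calendar_date < '2020-12-31'
)
SELECT calendar_date,
       DATENAME(weekday, calendar_date) AS weekday_name,
       DATENAME(month, calendar_date)   AS month_name
FROM dates
OPTION (MAXRECURSION 366);
```

Note the MAXRECURSION hint: the default recursion limit is 100, so a full year of rows needs it raised.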
Passing the generated date value to the database through the sql() function looks like this:
sql('DatastoreName', 'SELECT somefunction(\'' || to_char(query.datecolumn, 'YYYYMMDD HH24:MI:SS') || '\') FROM DUAL')
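Adapted to the SQL Server case in the question, a sketch could be (the datastore name and format string are illustrative; DS evaluates the to_char() concatenation before sending the statement, so the database only ever sees a literal date string):

```
sql('SQLServer_DS', 'SELECT DATENAME(weekday, \'' || to_char(Date_Generation.DI_GENERATED_DATE, 'YYYY-MM-DD') || '\')')
```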
Keep in mind that using the sql() function can greatly slow down your dataflow, since it fires one statement per row. But if this is a one-time, or once-a-year, operation then it's no big deal.