Varchar to Datetime conversion

Hi All,

I have a ZTABLE in my ECC system (source system) and an MSSQL database (target system). The ZTABLE has one field as a timestamp, which I populate with values like “2010.09.03 05:40:15” in varchar format.

I have developed an initialization & delta scenario as per the tutorial given in DI, with global variables startime and endtime in the same format (datetime). My initialization job is working fine, but I am facing a problem in my delta job.

In the Query transform, I added the condition for delta records as per the tutorial, with a modification:

So the function above should convert the ZTABLE → Timestamp field to datetime format, and it does. The problem is that this condition ignores the minutes and seconds.

Example: I execute the delta job at 12:00 PM while one record in my ZTABLE has a timestamp of 12:15 PM; that record does not arrive in my SQL table. But if I change the record’s timestamp to 1:15 PM and execute the delta job again, the record does show up in the SQL table.

Does anyone know why this is happening and how I can solve it?

Thanks in advance.

Rishit Kamdar


rishit.kamdar (BOB member since 2010-09-03)

‘YYYY.MM.DD HH24:MI:SS’
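For illustration only, this DS format string maps one-to-one onto Python’s `strptime` tokens (an analogy, not DS code):

```python
from datetime import datetime

# DS format 'YYYY.MM.DD HH24:MI:SS' corresponds to '%Y.%m.%d %H:%M:%S' here
ts = datetime.strptime("2010.09.03 05:40:15", "%Y.%m.%d %H:%M:%S")
print(ts)  # minutes and seconds survive the conversion
```

With the full format, nothing past the hour is lost in the parse.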


Werner Daehn :de: (BOB member since 2004-12-17)

Hi wdaehn,

Thanks for your quick reply.

I did as you suggested, but it is still not working…

I have even changed my SET_NEW_START_END_TIME script as follows:

and my query condition to:

But it is still truncating the minutes and seconds, and I am not getting any records whose timestamps fall within one hour of my last delta job run.

Example: if I execute my delta job at 1:00:00 AM, the next delta job does not pick up data from 1:00:00 AM to 1:59:59 AM.
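The symptom is consistent with the timestamp being truncated to the hour before the comparison. A minimal sketch of that effect (illustrative Python, not DS):

```python
from datetime import datetime

last_run = datetime(2010, 9, 3, 1, 0, 0)   # last delta run: 1:00:00 AM
record = datetime(2010, 9, 3, 1, 15, 0)    # record changed at 1:15:00 AM

# If the conversion drops minutes and seconds, the record collapses
# onto the hour boundary and fails a "> last_run" filter.
truncated = record.replace(minute=0, second=0)
kept_with_truncation = truncated > last_run      # False: record is missed
kept_with_full_precision = record > last_run     # True: record is picked up
```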

Once again thanks for your reply.

Rishit


rishit.kamdar (BOB member since 2010-09-03)

Are you able to read the varchar data from the ZTABLE_TIMESTAMP column into a column with datatype datetime correctly? To check, change the column datatype to datetime in the Query transform and map ZTABLE_TIMESTAMP to it (don’t use to_date; let DI do the implicit conversion).

If you are able to read it correctly, then use another Query transform to do the filtering, using the datetime column to compare against the datetime variable; there is no need for to_date if both are datetime.

Have you tried not using the to_date() function? What is the datatype of the variable you are comparing this column with? Do you get a conversion warning when you validate the DF?


manoj_d (BOB member since 2009-01-02)

Hi Manoj,

Thanks for your reply.

I changed the datatype of the ZTABLE → TIMESTAMP field in SQL from smalldatetime to datetime and wrote the where-clause condition in lowercase… and it works!
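For reference, SQL Server’s smalldatetime type only keeps minute accuracy, which matches part of the precision loss described above. A rough simulation (my assumption for illustration: rounding to the nearest minute):

```python
from datetime import datetime, timedelta

def to_smalldatetime(dt):
    """Approximate SQL Server SMALLDATETIME behaviour:
    minute precision, rounding to the nearest minute."""
    rounded = dt.replace(second=0, microsecond=0)
    if dt.second >= 30:
        rounded += timedelta(minutes=1)
    return rounded

print(to_smalldatetime(datetime(2010, 9, 3, 5, 40, 15)))  # seconds are gone
```

Switching the target column to datetime avoids this rounding entirely.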

Once again, thank you and wdaehn for your replies…

Rishit Kamdar


rishit.kamdar (BOB member since 2010-09-03)

Hi…

I have an SAP ECC table as my source system, and the table will have millions of records. I found online that an ABAP data flow is the right way to extract data from an ECC table, so I have tried to implement the same scenario with an ABAP data flow. My current scenario looks like this:

  1. Job → START_END_TIME, Data flow, UPDATE_CDC_TABLE

  2. Data flow → ABAP Dataflow, Table Comparison, Target_SQL Table

  3. ABAP dataflow → SAP_TABLE, Query, Data Transport file

My Query contains the condition:

When I executed the job, I got an error like:

So I removed the to_date function and tried again with the condition:

After this I am able to execute my job without errors… but I am not getting my delta records…

Am I missing something? Or is there any function that converts varchar to datetime in an ABAP data flow?
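One possible workaround (my suggestion, not from the thread): because the varchar layout ‘YYYY.MM.DD HH24:MI:SS’ is fixed-width, zero-padded, and ordered from most to least significant, plain string comparison orders these timestamps chronologically, so the ABAP data flow could filter on the varchar column directly against a varchar rendering of the global variable:

```python
# Fixed-width 'YYYY.MM.DD HH24:MI:SS' strings sort chronologically,
# so a plain string comparison can serve as the delta filter.
stamps = [
    "2010.09.03 05:40:15",
    "2010.09.03 12:15:00",
    "2010.09.04 01:15:00",
]
last_run = "2010.09.03 12:00:00"
delta = [s for s in stamps if s > last_run]
print(delta)
```

No datetime conversion is needed on the ABAP side at all with this approach.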

Once again I need your input on this…

Thanks in advance,

Rishit


rishit.kamdar (BOB member since 2010-09-03)

There isn’t a great deal that Data Services can do in ABAP dataflows. Even certain types of aggregation don’t seem to be possible. In the case that you are describing, I would usually extract all the data into a staging table and then apply the selection criteria. I realise that this isn’t very efficient but until DS can apply more functions inside ABAP dataflows we are a bit stuck.


Nemesis :australia: (BOB member since 2004-06-09)

Hi Nemesis,

That is what I have done for now, but I have one concern about it.

Assume that I have millions of records in my SAP tables. Which will be better: an ABAP data flow or direct access? With an ABAP data flow, my delta job will fetch all records into the data transport object and I then have to filter them by timestamp, whereas with direct access I can filter the records by timestamp at the source.

I don’t know which I should choose in such a situation.

Once again thanks for your input.

Rishit


rishit.kamdar (BOB member since 2010-09-03)

You raise an interesting point. I would favour direct access. Usually SAP doesn’t allow direct access (it’s unsupported!), and that restriction tends to become embedded as an architectural principle.
However, this may change with the Aurora release of Business Objects as Data Services will be able to access the BW extractors and, therefore, extract from SAP directly.


Nemesis :australia: (BOB member since 2004-06-09)

Hello All,

I am getting the SAME error, but to my surprise I am not doing any varchar-to-datetime conversions in the data flow :hb: .

With the ABAP dataflow I am getting the data to an FTP file with no issues, but from the FTP file to the target SQL Server table I am getting the following error:

(11.7) 02-21-11 15:24:47 (E) (31316:32520) DBS-070404:
|Dataflow DF_XXXX|Loader XXXXX_ SQL submitted to ODBC data source resulted in error <[Microsoft][ODBC SQL Server Driver][SQL Server]The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.>.

I checked the warning messages while validating the DF but couldn’t find any varchar-to-datetime conversions.

Thank you in advance.


rrayanna (BOB member since 2010-07-12)

SQL Server cannot deal with dates earlier than ‘1900.01.01’, whereas SAP can. So maybe you are trying to load the date value ‘0000.01.01’ into SQL Server?
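A defensive cleanup step along those lines could map SAP “initial” dates to NULL before the load. This is only a sketch (the function name and the YYYYMMDD input format are my assumptions); for reference, SQL Server’s SMALLDATETIME starts at 1900-01-01 and DATETIME at 1753-01-01:

```python
from datetime import datetime

SMALLDATETIME_MIN = datetime(1900, 1, 1)  # SQL Server SMALLDATETIME lower bound

def clean_sap_date(raw):
    """Map SAP initial dates and out-of-range values to None before
    loading into SQL Server (input format assumed to be YYYYMMDD)."""
    if raw is None or raw == "00000000":
        return None
    dt = datetime.strptime(raw, "%Y%m%d")
    return dt if dt >= SMALLDATETIME_MIN else None

print(clean_sap_date("00000000"))   # initial date -> None
print(clean_sap_date("20100903"))   # valid date passes through
```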


Werner Daehn :de: (BOB member since 2004-12-17)

Hi to all,

In my DI job the source is a text file, which has one datetime column (e.g. 01/22/2011 09:11:06 PM).

Through DI I am able to load the data into Oracle; I use this date conversion in DI for that field:

to_date(field colum, ‘mm/dd/yyy hh:mi:ss’) as datetime .

But the output comes out as: 01/22/2011 09:11:06 AM

Instead of the PM or AM from the source, the output shows the opposite, i.e. AM instead of PM. This happens even though I also tried the format mm/dd/yyyy hh24:mi:ss…
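The symptom fits a format string with no AM/PM token: ‘hh’ reads 09 as nine o’clock and the trailing PM marker never influences the result. In Python terms (an analogy only, not DS syntax), the 12-hour parse needs %I together with %p:

```python
from datetime import datetime

raw = "01/22/2011 09:11:06 PM"
# 12-hour clock token %I must be paired with the AM/PM token %p
dt = datetime.strptime(raw, "%m/%d/%Y %I:%M:%S %p")
print(dt)  # hour comes out as 21, so the PM is honoured
```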

Can anyone please help me with this issue?

Thanks in Advance … Sureshchowdary


vasanthasuresh (BOB member since 2011-07-28)

edit: wrong answer.

Sorry.


Werner Daehn :de: (BOB member since 2004-12-17)

I have a column in a table that has varchar dates in it (“MM/DD/YY”) and I need to convert them to datetime.

I have tried convert(datetime, dte, 1) and get a date overflow. I have looked and searched and can’t find anything that will fix it. I have also tried cast().


hermandez (BOB member since 2011-08-18)

What about the Data Services to_date() function, as shown above?

to_date(‘12/31/10’, ‘MM/DD/YY’)
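Two-digit years always rely on an implicit pivot rule. For comparison, Python’s %y (illustrative only, not the DS rule) maps 00–68 to 2000–2068 and 69–99 to 1969–1999:

```python
from datetime import datetime

dt = datetime.strptime("12/31/10", "%m/%d/%y")
print(dt.year)  # 2010 under Python's %y pivot rule
```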


Werner Daehn :de: (BOB member since 2004-12-17)