I have one ZTABLE (Source System) in my ECC system and MSSQL (Target System). The ZTABLE has one field as a time stamp, and I am populating data like “2010.09.03 05:40:15” into this field in varchar format.
I have developed an Initialization & Delta scenario as per the tutorial given in DI. I have global variables starttime and endtime in the same format (datetime). My Initialization job is working fine; I am facing a problem with my Delta job.
In the Query, I have added the condition for delta records as per the tutorial, with a modification.
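It is roughly along these lines (this is only a sketch; the column name and the global variable names here are placeholders based on the tutorial, not necessarily what is in my job):

    to_date(ZTABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') > $G_STARTTIME
    and to_date(ZTABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') <= $G_ENDTIME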
So the to_date function above should convert my ZTABLE Timestamp field into datetime format, and it does. The problem is that this condition seems to ignore the minutes and seconds part.
Example: I execute the delta job at 12:00 PM and I have one record with a time stamp of 12:15 PM in my ZTABLE; that record does not come through to my SQL table. If I change the time stamp of this record to 01:15 PM and execute the delta job again, then I do get the record in the SQL table.
Does anyone know why this is happening and how I can solve it?
Are you able to read the varchar data from the ZTABLE_TIMESTAMP column into a column with datatype datetime correctly? To do this, in the Query transform change the column datatype to datetime and map ZTABLE_TIMESTAMP to it (don’t use to_date, let DI do the implicit conversion).
If you are able to read this correctly, then use another Query transform to do the filtering, using the datetime column to compare against the datetime variable. There is no need to use to_date if both are of datetime type.
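As a rough sketch of what I mean (column and variable names are placeholders):

    Query_1: add output column TS_DT with datatype datetime
        TS_DT = ZTABLE.ZTABLE_TIMESTAMP    # implicit varchar -> datetime conversion
    Query_2: filter on the datetime column in the where clause
        Query_1.TS_DT > $G_STARTTIME and Query_1.TS_DT <= $G_ENDTIME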
Have you tried not using the to_date() function? What is the datatype of the variable that you are comparing this column with? Do you get a conversion warning when you validate the DF?
I have changed the datatype of the ZTABLE -> TIMESTAMP field in SQL to datetime (from smalldatetime) and wrote the condition in the where clause in lower case… and it works !!!
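For reference, on the SQL Server side the datatype change is just something like this (table and column names are placeholders):

    ALTER TABLE dbo.ZTABLE_TARGET ALTER COLUMN [TIMESTAMP] datetime;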
I have an SAP ECC table as my source system and the table will have millions of records… I found on the net that an ABAP data flow is the right way to extract data from an ECC table… So I have tried to implement the same scenario with an ABAP data flow. My current scenario is like this:
Job → START_END_TIME, Data flow, UPDATE_CDC_TABLE
Data flow → ABAP Dataflow, Table Comparison, Target_SQL Table
ABAP dataflow → SAP_TABLE, Query, Data Transport file
My Query contains a condition on the time stamp.
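It is roughly like this (a sketch only; the column and variable names are placeholders):

    to_date(SAP_TABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') > $G_STARTTIME
    and to_date(SAP_TABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') <= $G_ENDTIME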
When I executed the job… I got an error like
So I removed the to_date function… and then tried again with the condition
After this I am able to execute my job without errors… but I am not getting my delta records…
Am I missing something?? Or is there any function which converts varchar to datetime in an ABAP data flow??
There isn’t a great deal that Data Services can do in ABAP dataflows. Even certain types of aggregation don’t seem to be possible. In the case that you are describing, I would usually extract all the data into a staging table and then apply the selection criteria. I realise that this isn’t very efficient but until DS can apply more functions inside ABAP dataflows we are a bit stuck.
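As a rough sketch of that pattern (object names are placeholders):

    ABAP dataflow:    SAP_TABLE -> Query (no date filter) -> Data Transport -> STG_TABLE
    Regular dataflow: STG_TABLE -> Query with where clause
                          to_date(STG_TABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') > $G_STARTTIME
                          and to_date(STG_TABLE.TIMESTAMP, 'YYYY.MM.DD HH24:MI:SS') <= $G_ENDTIME
                      -> Table Comparison -> target SQL table

to_date is available in a regular dataflow, so the time-stamp filter can be applied there once the full extract has landed in staging.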
That is what I am doing right now. I have one concern about this.
Assume that I have millions of records in my SAP tables. What will be better: an ABAP data flow or direct access? If I go for an ABAP data flow, my delta job is going to fetch all the records up to the Data Transport object and then I still have to filter them by time stamp, whereas with direct access I can filter the records by time stamp directly.
I don’t know which I should select in such a situation.
You raise an interesting point. I would favour direct access. Usually SAP doesn’t allow direct access (it’s unsupported!) and that becomes embedded as an architectural principle.
However, this may change with the Aurora release of Business Objects as Data Services will be able to access the BW extractors and, therefore, extract from SAP directly.
I am getting the SAME error. But, to my surprise, I am not doing any varchar to datetime conversions in the data flow.
With the ABAP dataflow I am getting the data to an FTP file with no issues. But from FTP to the target SQL Server table, I am getting the following error.
(11.7) 02-21-11 15:24:47 (E) (31316:32520) DBS-070404:
|Dataflow DF_XXXX|Loader XXXXX_ SQL submitted to ODBC data source resulted in error <[Microsoft][ODBC SQL Server Driver][SQL Server]The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.>.
I checked the warning messages while validating the DF but couldn’t find any varchar to datetime conversions.
I have a column in a table that has varchar dates in it in “MM/DD/YY” format.
I need to convert it to a datetime.
I have tried convert(datetime, dte, 1) and get a date overflow. I have looked and searched and can’t find anything that will fix it. I have also tried cast().
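For reference, one way to narrow this down is to find the rows that will not convert and handle only the clean ones. A rough sketch (table and column names are placeholders, and ISDATE depends on the session’s DATEFORMAT/language settings):

    -- rows that will not convert cleanly (these usually cause the overflow/out-of-range error)
    SELECT dte FROM dbo.MyTable WHERE ISDATE(dte) = 0;

    -- convert only the clean rows; style 1 = mm/dd/yy
    SELECT CONVERT(datetime, dte, 1) AS dte_dt FROM dbo.MyTable WHERE ISDATE(dte) = 1;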