Dear Experts, after some struggle I have finally set up the Rapid Mart on the BO server. The connection is established with ECC 6.0 as the source system. My target is a SQL Server 2005 database. I have the data stores R3_DS (source) and RM_DS (target). Now when I execute the job, my Data Services job process is terminated with this error message:
“(12.2) 10-30-10 14:50:35 (E) (9084:9052) R3C-150607: |Data flow DF_FiscalPeriodNotYrSpecific_SAP
Execute ABAP program <C:/temp/ABAP_Programs/ZFISCPER.aba> error < Open File Error – C:\temp/dates.dat>.
(12.2) 10-30-10 14:50:35 (E) (2432:7396) R3C-150607: |Data flow DF_FiscalPeriodNotYrSpecific_SAP
Execute ABAP program <C:/temp/ABAP_Programs/ZFISCPER.aba> error < Open File Error – C:\temp/dates.dat>.”
Has anybody come across this, and how was it resolved?
I have set up my data store as Direct download and specified local directories. Also I supplied the path for the dates.dat in the
Actually, none of the Rapid Mart solutions are working for me. The four solutions Cost Center, GL Account, Human Resources and Production Planning (the ones that come with the ABAP package) all give errors similar to the one I mentioned.
The other ones all give me this error during the load, and the load is terminated:
“(12.2) 10-31-10 00:34:34 (0788:8296) WORKFLOW: Work flow <C_GLAccountByCompany_SAP> is completed successfully.
(12.2) 10-31-10 00:34:34 (0788:8296) WORKFLOW: Work flow <C_FiscalPeriod_SAP> is started.
(12.2) 10-31-10 00:34:34 (8088:7704) DATAFLOW: Process to execute data flow <DF_FiscalPeriodNotYrSpecific_SAP> is started.
(12.2) 10-31-10 00:34:36 (8088:7704) DATAFLOW: Data flow <DF_FiscalPeriodNotYrSpecific_SAP> is started.
(12.2) 10-31-10 00:34:36 (8088:7704) ABAP: ABAP flow <R3_FiscPer_NOT_YrSpecific> is started.
(12.2) 10-31-10 00:34:39 (8088:7704) DATAFLOW: Data flow <DF_FiscalPeriodNotYrSpecific_SAP> is terminated due to error <150607>.
(12.2) 10-31-10 00:34:39 (8088:7704) DATAFLOW: Process to execute data flow <DF_FiscalPeriodNotYrSpecific_SAP> is completed.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <C_FiscalPeriod_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <WF_VendorItem_Dims_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <C_VendorItem_Section_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) JOB: Job <Accounts_Payable_Load_SAP> is terminated due to error <150607>…”
I was under the impression that Rapid Mart should work out of the box. Please guide me on this.
Actually, I created this temp folder on the C:\ drive and placed the dates.dat file in it. Then I provided this path in Working Directory in the SAP Server option of the R3_DS data store. I do see that ZFISCPER.aba (18 KB) is created in the ABAP folder.
So I need to create the folder C:\temp on the SAP server and place the dates.dat file there?
Then I will create another folder C:\temp\ABAP on the SAP server and place all the related ABAP programs (the ones that come with the four Rapid Marts) in there? Am I right in my understanding?
I think I then need to change my method to “Shared” and remove Direct Download?
Also, I believe I will not have to create the same folder on my own computer, and should not provide the path in the data store?
There are two parts to ABAP dataflow execution.
1. The ABAP execution mode option:
a. Generate and Execute: generates a .aba file for every execution, and the files are accessed by SAP through function modules, which you upload onto the SAP system through RFC, even though the location is outside SAP.
Usually the jobs are executed through a system account, and it has access to this folder.
b. Execute Preloaded: generates .aba files, and we need to upload them to the SAP system; then Data Services can call them when required.
2. The .dat files created after ABAP execution can be accessed in multiple ways (FTP, shared folder, or Direct Download), depending on requirements.
That’s exactly my point: the .aba files are never accessed by SAP. The ABAP program is generated by the engine, a copy of the generated text is saved as a file for backup by the engine, and the text is passed as a table parameter to the RFC_ABAP_INSTALL_AND_RUN function module by the engine. SAP is not involved and does not need access to the file. Check out the parameters of this function module: it does not ask for a file; the source code is passed into the function line by line via its table parameter.
A minor correction, but it is the root cause of the confusion in this thread: data files are shared, .aba files are not.
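To illustrate the point above, here is a minimal Python sketch of how generated ABAP source text can be handed to RFC_ABAP_INSTALL_AND_RUN line by line as a table parameter, with no file ever being transferred to the SAP server. The pyrfc connection and the sample report name are assumptions for illustration only; Data Services does the equivalent internally through its own RFC layer.

```python
# Sketch only: shows ABAP source being passed as a table parameter,
# one row per line, to RFC_ABAP_INSTALL_AND_RUN. No file reaches SAP.

def source_to_table(abap_source: str) -> list[dict]:
    """Turn ABAP source text into rows for the PROGRAM table parameter:
    one dict per line with a single LINE field (the table line type is
    72 characters wide, so longer lines are split across rows)."""
    rows = []
    for line in abap_source.splitlines():
        while len(line) > 72:
            rows.append({"LINE": line[:72]})
            line = line[72:]
        rows.append({"LINE": line})
    return rows

# Hypothetical generated program text (stands in for ZFISCPER.aba):
abap = "REPORT zfiscper.\nWRITE: / 'hello from generated ABAP'."
program_table = source_to_table(abap)

# With a live connection the call would look roughly like this
# (pyrfc and all connection values are assumptions):
#
#   from pyrfc import Connection
#   conn = Connection(ashost="...", sysnr="00", client="100",
#                     user="...", passwd="...")
#   result = conn.call("RFC_ABAP_INSTALL_AND_RUN", PROGRAM=program_table)
```

Note that the function module receives only the table rows; this is why the .aba file on disk is just a backup copy on the Data Services side.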
I understand that we FTP the dates.dat file to the SAP working directory as part of the configuration.
Also, the ABAP files are FTPed to the SAP server for execution, so the FTP port is opened on the SAP ECC server.
Can you help me understand how the FTP works? How will the files be sent back to the BODS server, since a port can be opened on either side (BODS / SAP server)?
I thought that the .dat files created during job execution would be stored in the working directory only, and that BODS would refer to the .dat files as part of extraction.
Kindly provide your inputs. If you can share an architecture diagram of how this will work, that would be great.
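One common setup (an assumption about your environment, not a definitive answer): the Data Services job server acts as the FTP client and pulls the generated .dat file from the SAP working directory, so only the SAP side needs to accept inbound FTP connections; nothing needs to be opened on the BODS box. A minimal sketch with Python's standard ftplib, where all hostnames, credentials, and paths are placeholders:

```python
# Sketch of the data-file leg only: the DS side is the FTP *client*,
# so only the SAP server runs an FTP listener. All values below are
# hypothetical placeholders.
from ftplib import FTP

def fetch_dat(host, user, password, remote_path, local_path):
    """Pull a generated .dat file from the SAP working directory
    down to the Data Services local directory."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.set_pasv(True)  # passive mode: the client opens both connections
        with open(local_path, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_path}", fh.write)

# Usage (hypothetical values):
# fetch_dat("sap-ecc-host", "ftpuser", "secret",
#           "/usr/sap/trans/ds_work/dates.dat",
#           r"C:\temp\dates.dat")
```

Passive mode matters here: both the control and data connections are opened from the DS side toward SAP, which is why only the SAP server's FTP ports need to be reachable.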
@Yogendra: I have the same environment and faced some issues regarding file creation. Usually we can NFS-mount a Windows disk on Unix, or a Unix disk on Windows. If you are dealing with files greater than 1 GB or so, then while SAP is writing onto the shared disk the file may be corrupted (part of the file stripped off due to heavy network traffic) if the Windows disk is mounted on Unix; this leads to warnings, and sometimes the job fails. So make sure the NFS mount settings are set for maximum bandwidth. Or, if you are setting it up newly, I would suggest mounting Unix on Windows, which is a bit faster.
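The "maximum bandwidth" advice above usually translates into NFS mount options such as large rsize/wsize buffers and a hard TCP mount. A sketch of what that command might look like, built here in Python only so the pieces are easy to test; the hostname, export path, and mount point are placeholders, and the actual invocation is commented out because it needs root:

```python
# Sketch: assembling an NFS mount command with large read/write buffers.
# All hostnames and paths are hypothetical placeholders.
import subprocess

# rsize/wsize set the transfer block size; 'hard,tcp' retries on network
# trouble instead of silently truncating large files.
MOUNT_OPTS = "rsize=65536,wsize=65536,hard,tcp"
cmd = ["mount", "-t", "nfs", "-o", MOUNT_OPTS,
       "sap-host:/usr/sap/trans/ds_work", "/mnt/ds_work"]

# subprocess.run(cmd, check=True)  # requires root; disabled in this sketch
print(" ".join(cmd))
```

Whether these exact option values suit your network is something to verify with your Basis/Unix team; the point is only that buffer size and hard mounting are the tunables behind the advice.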
“b. Execute Preloaded: generates .aba files, and we need to upload them to the SAP system; then Data Services can call them when required.”
I have executed the job in the dev environment, where the ABAP execution mode was set to Generate and Execute.
However, I am moving on to the test environment and understand that I need to select Execute Preloaded. Can you or someone let me know which files need to be moved to the working directory on the SAP ECC server?
I would appreciate it if you or someone could shed some light on the steps to take care of while moving to the next environment.