Problem loading Rapid Marts

Dear Experts, after some struggle I have finally set up the Rapid Mart on the BO server. The connection is established with ECC 6.0 as the source system, and my target is a SQL Server 2005 database. I have datastores R3_DS and RM_DS as source and target. Now when I execute the job, my Data Services job process is terminated with this error message:

“(12.2) 10-30-10 14:50:35 (E) (9084:9052) R3C-150607: |Data flow DF_FiscalPeriodNotYrSpecific_SAP
Execute ABAP program <C:/temp/ABAP_Programs/ZFISCPER.aba> error < Open File Error – C:\temp/dates.dat>.
(12.2) 10-30-10 14:50:35 (E) (2432:7396) R3C-150607: |Data flow DF_FiscalPeriodNotYrSpecific_SAP
Execute ABAP program <C:/temp/ABAP_Programs/ZFISCPER.aba> error < Open File Error – C:\temp/dates.dat>.”

Has anybody come across this, and how was it resolved?

I have set up my datastore with Direct Download and specified local directories. I also supplied the path for dates.dat in the R3_DS datastore.

Actually, none of the Rapid Mart solutions are working for me. The four solutions Cost Center, GL Account, Human Resources and Production Planning (the ones that come with the ABAP package) are all giving errors similar to the one I mentioned.

The other ones all give me this error during the load, and the load is terminated:

“(12.2) 10-31-10 00:34:34 (0788:8296) WORKFLOW: Work flow <C_GLAccountByCompany_SAP> is completed successfully.
(12.2) 10-31-10 00:34:34 (0788:8296) WORKFLOW: Work flow <C_FiscalPeriod_SAP> is started.
(12.2) 10-31-10 00:34:34 (8088:7704) DATAFLOW: Process to execute data flow <DF_FiscalPeriodNotYrSpecific_SAP> is started.
(12.2) 10-31-10 00:34:36 (8088:7704) DATAFLOW: Data flow <DF_FiscalPeriodNotYrSpecific_SAP> is started.
(12.2) 10-31-10 00:34:36 (8088:7704) ABAP: ABAP flow <R3_FiscPer_NOT_YrSpecific> is started.
(12.2) 10-31-10 00:34:39 (8088:7704) DATAFLOW: Data flow <DF_FiscalPeriodNotYrSpecific_SAP> is terminated due to error <150607>.
(12.2) 10-31-10 00:34:39 (8088:7704) DATAFLOW: Process to execute data flow <DF_FiscalPeriodNotYrSpecific_SAP> is completed.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <C_FiscalPeriod_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <WF_VendorItem_Dims_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) WORKFLOW: Work flow <C_VendorItem_Section_SAP> is terminated due to an error <150607>.
(12.2) 10-31-10 00:34:39 (0788:8296) JOB: Job <Accounts_Payable_Load_SAP> is terminated due to error <150607>…”

I was under the impression that Rapid Marts should work out of the box. Please guide me on this.


azid123 (BOB member since 2010-11-01)

Please look into the folder where you are creating the .aba files for the ABAP (R/3) dataflows. It seems to be a file access issue.

C:\temp/dates.dat may not be a valid path (note the mixed forward and backslashes).

Check whether the .aba file is created, then check whether the ABAP program is executed, and after that check whether the required .dat file is created.
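If it helps, here is a minimal Python sketch for running through that checklist from the Data Services machine; the paths are only examples taken from the error message in this thread and should be replaced with your own Working Directory and Generated ABAP directory settings.

    import os

    # Example paths based on the error message above -- adjust them to your
    # own Working Directory / Generated ABAP directory settings.
    paths = [
        r"C:\temp\ABAP_Programs\ZFISCPER.aba",  # generated ABAP program
        r"C:\temp\dates.dat",                   # .dat file the dataflow tries to open
    ]

    for path in paths:
        if os.path.isfile(path):
            print(f"OK      {path} ({os.path.getsize(path)} bytes)")
        else:
            print(f"MISSING {path}")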

By the way, which Rapid Mart are you trying to execute, and what is its version?


SantoshNirmala :india: (BOB member since 2006-03-15)

Thanks Nirmala,

Actually, I created this temp folder on the C:\ drive and placed the dates.dat file in it, then provided this path in the Working Directory field of the SAP Server options in the R3_DS datastore. I do see that ZFISCPER.aba (18 KB) is created in the ABAP folder.

Kindly guide me through this…


azid123 (BOB member since 2010-11-01)

You need to upload the file to the SAP server.


Werner Daehn :de: (BOB member since 2004-12-17)

Just to clarify:

I need to create this folder C:\temp on the SAP server and place the dates.dat file in there.
Then I will create another folder C:\temp\ABAP on the SAP server and place all related ABAP programs (that come with the four Rapid Marts) in there? Am I right in my understanding?

I think I then need to change my method to “Shared” instead of Direct Download?

Also, I believe I will not have to create the same folder on my computer and should not provide the path in the datastore?

Please correct me if I’m wrong.


azid123 (BOB member since 2010-11-01)

Forget about the ABAP programs; they are uploaded via RFC, not as files.


Werner Daehn :de: (BOB member since 2004-12-17)

There are two parts to ABAP dataflow execution.

  1. ABAP execution option
    a. Generate and Execute: generates an .aba file for every execution; the files are accessed by SAP through function modules, which you upload to the SAP system through RFC, even though the file location is outside SAP.
    Usually the jobs are executed through a system account that has access to this folder.

    b. Execute Preloaded: generates .aba files, which we need to upload to the SAP system. Data Services can then call them when required.

  2. The .dat files created after ABAP execution can be accessed in several ways, such as FTP, a shared folder, or Direct Download, depending on requirements.


SantoshNirmala :india: (BOB member since 2006-03-15)

That’s exactly my point. The .aba files are never accessed by SAP. The ABAP program is generated by the engine, a copy of the generated text is saved as a file for backup by the engine, and the text is passed as a table parameter to the RFC_ABAP_INSTALL_AND_RUN function module by the engine. SAP is not involved and does not need access to the file. Check out the parameters of this function module: it does not ask for a file; the source code is passed into the function line by line via its table parameter.

A minor correction, but it is the root cause of the confusion in this thread: data files are shared, .aba files are not.
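For illustration only, here is a minimal Python sketch of that mechanism using the open-source pyrfc library (this is not what Data Services itself runs; the connection parameters and file path are placeholders). It shows the generated source travelling line by line in the PROGRAM table parameter, so the SAP side never needs to see the .aba file.

    from pyrfc import Connection  # SAP NW RFC SDK binding; assumed to be installed

    # Placeholder logon parameters -- substitute your own ECC system details.
    conn = Connection(ashost="sapecc.example.com", sysnr="00",
                      client="100", user="RFC_USER", passwd="secret")

    # Read the generated .aba file on the Data Services machine and split it
    # into the 72-character rows the PROGRAM table parameter expects.
    with open(r"C:\temp\ABAP_Programs\ZFISCPER.aba") as f:
        program_lines = [{"LINE": line.rstrip("\n")[:72]} for line in f]

    # Equivalent of what the engine does: the source code is passed as a table
    # parameter, so SAP does not need filesystem access to the .aba file.
    result = conn.call("RFC_ABAP_INSTALL_AND_RUN", PROGRAM=program_lines)
    print(result.get("WRITES", []))  # any list output produced by the report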


Werner Daehn :de: (BOB member since 2004-12-17)

Hi,

I am trying to understand how the Rapid Mart works.

I understand that the .dat files are created in the Working Directory on the SAP server.

How is the data from these .dat files loaded into the target tables?

Are the .dat files FTPed back to a local folder on the job server?

Trying to understand how this works.

Regards,
YR


yogendra (BOB member since 2004-10-19)

There are three options:

  1. FTP them to the job server machine.
  2. Put them in a shared directory.
  3. Direct Download.

Once the .dat file is generated, the data is read from it for further processing (a sketch of the FTP option is shown below).
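To make the FTP option concrete, here is a small Python sketch using the standard ftplib module; the host name, credentials and directories are placeholders for whatever is configured in the SAP datastore.

    from ftplib import FTP

    # Placeholder values -- take the real ones from the SAP datastore settings
    # (FTP host, SAP-side working directory, local/client download directory).
    ftp = FTP("sapecc.example.com")
    ftp.login(user="ftpuser", passwd="secret")
    ftp.cwd("/usr/sap/trans/ds_work")  # working directory where the .dat file lands

    # Pull the generated .dat file down to the job server so the dataflow can read it.
    with open(r"C:\temp\dates.dat", "wb") as local_file:
        ftp.retrbinary("RETR dates.dat", local_file.write)

    ftp.quit()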


SantoshNirmala :india: (BOB member since 2006-03-15)

Thanks for the information Santosh.

I understand that we FTP the dates.dat file to the SAP Working Directory as part of the configuration.

Also, the ABAP files are FTPed to the SAP server for execution, so the FTP port is opened on the SAP ECC server.

Can you help me understand how the FTP works? How will the files be sent back to the BODS server, since the port can be opened on either side (BODS / SAP server)?

I thought that the .dat files created during job execution would be stored in the Working Directory only, and that BODS would refer to the .dat files as part of extraction.

Kindly provide your inputs… If you can share an architecture diagram of how this works, that would be great.

Regards,
YR


yogendra (BOB member since 2004-10-19)

If you are already doing an NFS mount, why not use the shared directory option rather than FTP? The process is much quicker.


zuluking (BOB member since 2012-05-31)

Hi ZK,

Thanks for your response.

I have a heterogeneous environment:

SAP on Unix and BODS on Windows. Is it recommended to use the Shared Directory option?

Did you use the Shared Directory option in your case?

Thanks,
YR


yogendra (BOB member since 2004-10-19)

Yes, the shared directory works, since you are doing almost the same setup for FTP with the NFS mount anyway. There is no need for an FTP server either.


zuluking (BOB member since 2012-05-31)

@Yogendra: I have the same environment and faced some issues with file creation. Usually we can NFS-mount a Windows disk on Unix or a Unix disk on Windows. If you are dealing with files greater than 1 GB or so, and the Windows disk is mounted on Unix, the file may be corrupted while SAP is writing to the shared disk (part of the file gets stripped off under heavy network traffic); this leads to warnings and sometimes the job fails. So make sure the NFS mount settings are set to maximum bandwidth, or, if you are setting it up new, I would suggest mounting the Unix disk on Windows, which is a bit faster.
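If you suspect that kind of truncation on the shared disk, here is a small Python sketch of one way to check (the UNC path is a placeholder for however the share appears on the BODS server): wait until the file size stops changing, then compare a checksum taken on the Windows side with one taken on the Unix side.

    import hashlib
    import os
    import time

    # Placeholder UNC path of the .dat file as seen through the shared/NFS disk.
    path = r"\\sapserver\ds_share\dates.dat"

    # If the size is still changing, SAP has not finished writing the file yet.
    size_1 = os.path.getsize(path)
    time.sleep(5)
    size_2 = os.path.getsize(path)
    print("still growing" if size_2 != size_1 else f"stable at {size_2} bytes")

    # An MD5 taken on both sides (Unix and Windows) should match if nothing
    # was stripped off while crossing the mount.
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    print("md5:", digest.hexdigest())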


Randy_KP (BOB member since 2012-10-23)

@Santosh

b. Execute Preloaded: generates .aba files, which we need to upload to the SAP system. Data Services can then call them when required.

I have executed the job in the dev environment, where the ABAP execution mode was set to Generate and Execute.

However, I am moving on to the test environment and understand that I need to select Execute Preloaded. Can you / someone let me know which files need to be moved to the working directory on the SAP ECC server?

I would appreciate it if you / someone could shed some light on the steps to take care of while moving to the next environment.

Regards,
YR


yogendra (BOB member since 2004-10-19)

http://wiki.sdn.sap.com/wiki/display/EIM/Moving+ABAP+to+Production+(DI+12.1)


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks Werner, things are clearer now.

Let me create the ABAP programs and try to execute the job with the ABAP execution option set to “Execute Preloaded”.

Hopefully this will do the trick.

Regards,
YR


yogendra (BOB member since 2004-10-19)