Extracting from SAP R/3

Hi, we are trying to extract data from ECC using BODS. Our BODS is on Windows NT, whereas SAP is on Unix. I have created an SAP datastore and connectivity is fine, as I can use small T tables as sources in a normal dataflow and extract data via RFC calls.
To get data from the bigger tables, I tried creating an ABAP dataflow that uses a shared directory. I get the following error:

13364 14696 R3C-150607 2/20/2013 6:39:16 PM Execute ABAP program <//IP(BODS)/de0/BUT000.aba> error < Open File Error – //IP(SAP)/bodsdata/de0/BUT000.dat>.

First question: our hosting team decided to use an NFS mount to share this directory between BODS and SAP. Is it possible to do this without using Samba?

Second, if it is possible, how do we resolve this error?

I saw similar questions answered, but they all involved Samba; that is not an option for us, hence this post.
Appreciate your time.


zuluking (BOB member since 2012-05-31)

Resolved: permissions were not set up correctly, and the path specified for the working directory was wrong.


zuluking (BOB member since 2012-05-31)

Hi,

We are currently planning to deploy GL RapidMart 3.2, and I am in somewhat the same situation.

The Unix team has created a working directory, and we are using an NFS mount to share this directory between BODS and SAP.

My questions are below:

We FTP the dates.dat files from BODS to the SAP ECC working directory.

I understand that the .dat files are created in the working directory on the SAP ECC server. How does the data from the .dat files get loaded into the target tables?

Are the .dat files FTP’ed from the source SAP ECC server to the BODS server?

Also, I would appreciate it if you could elaborate on what permissions were missing in your case.

Regards,
YR


yogendra (BOB member since 2004-10-19)

Yes, the .dat file is FTP’ed over from the application server to the BODS job server.
When you create your ABAP dataflow, this .dat file is used indirectly as the source and is mapped to your target table in staging.

The permission issue was that the user under which the job service runs didn’t have read/write permission on the SAP ECC server.
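In case it helps the mental model: below is a rough, hypothetical sketch of what the transport step of an ABAP dataflow does on the SAP side (the real generated program looks different; the path, table, and field list here are just for illustration). It selects the rows and writes them to the flat .dat file in the working directory. If the OS user running the SAP work process cannot write to that directory, the OPEN DATASET is what fails and surfaces in DS as the Open File Error.

REPORT zbods_transport_sketch.
* Illustrative only: the path and field list are assumptions,
* not the code DS actually generates.
DATA: lv_file TYPE string VALUE '/bodsdata/de0/BUT000.dat',
      lv_line TYPE string,
      lt_rows TYPE STANDARD TABLE OF but000,
      ls_row  TYPE but000.

SELECT * FROM but000 INTO TABLE lt_rows.

* This open fails (sy-subrc <> 0) when the work-process user has
* no write permission on the NFS-mounted working directory.
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  WRITE: / 'Open File Error:', lv_file.
  EXIT.
ENDIF.

LOOP AT lt_rows INTO ls_row.
  CONCATENATE ls_row-partner ls_row-type INTO lv_line SEPARATED BY ','.
  TRANSFER lv_line TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.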


zuluking (BOB member since 2012-05-31)

Thanks for the information ZK.

In my case,

BODS Server: Windows
SAP Server: Linux
File Server: Unix

The folder was created on the file server and mounted on the SAP server.

I am using FTP as the data transfer method.

When I execute the job, I get the error below.

4192 4524 R3C-150607 3/1/2013 11:26:18 AM Execute ABAP program <D:/RapidMartShare/GetSAPVersion.aba> error < Open File Error –
4192 4524 R3C-150607 3/1/2013 11:26:18 AM //fileserver/app/datashare/RAPIDMart_GL/SAPVersion.dat>.

Can you assist in resolving this?

I have an FTP user account that has read/write access to the folder on the file server. Do I need any more privileges?

Regards,
YR


yogendra (BOB member since 2004-10-19)

In the trace log, it says “ABAP started” but never “ABAP completed successfully”, doesn’t it?

Then the case is simple. Tell your SAP Basis team that the ABAP program you are using cannot write to the file
//fileserver/app/datashare/RAPIDMart_GL/SAPVersion.dat

Please note: the “working directory on SAP server” of your SAP datastore is a directory as seen by the ABAP program. Given that your SAP server is Linux, it must be a Linux path, not the Windows share!
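If in doubt, the Basis team can run a throwaway ABAP report like the sketch below to confirm the SAP work process can write into that directory. The path here is an assumed Linux-side mount point for the share, not your Windows path; substitute whatever the mount actually is.

REPORT z_check_workdir.
* Throwaway write test; the path is an assumed NFS mount point
* as seen from the SAP (Linux) host.
DATA lv_file TYPE string VALUE '/app/datashare/RAPIDMart_GL/write_test.txt'.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  TRANSFER 'ok' TO lv_file.
  CLOSE DATASET lv_file.
  WRITE: / 'Write OK:', lv_file.
ELSE.
  WRITE: / 'Open failed, sy-subrc =', sy-subrc.
ENDIF.

If this report fails, either the working directory in the datastore points to the wrong path or the mount permissions are off; DS itself is not involved.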


Werner Daehn :de: (BOB member since 2004-12-17)

Hi,

I am attaching a screenshot of the R3_DS configuration.

I would appreciate it if you could help me correct it. I am executing the DS job under an NT user ID. Does this account need read/write access to the folder on the file server?

Many Thanks
Regards,
YR
[Attachment: R3_DS Config.jpg]


yogendra (BOB member since 2004-10-19)

I am saying that in your trace log you will find a line saying the ABAP got started, but no “ABAP completed successfully” line. Hence the problem is on the SAP side, not related to the Job Server or the Job Server machine.
Does the SAP system know the working directory (/RapidMart_GL/), and can it write into it?


Werner Daehn :de: (BOB member since 2004-12-17)

Thanks for the clarity, Werner. I am able to proceed further.

The .dat files are being created in the local directory now, and data is being loaded into the target tables.

Fingers crossed until the job completes.

Regards,
YR


yogendra (BOB member since 2004-10-19)