Why do I get Error FIL-080101: Cannot open file?

Hi,

Most likely a user error, once again :(. I am using DS 12.2.0.0.

This question is also posted on SDN: http://forums.sdn.sap.com/thread.jspa?messageID=9673618&#9673618

I have created an SAP ERP datastore and imported one function, which is a simple BAPI. It takes 3 simple char parameters and returns a BAPIRET2 table.

I have a dataflow which contains:

  • An Excel source file with the input parameters
  • A query using the BAPI
  • A flat target file to contain the result

I have checked the following:

  • wiki https://wiki.sdn.sap.com:443/wiki/display/BOBJ/Shared+Directory+access
    –> I checked the Services logon properties for Data Services. The logon is set to the Administrator account and not to the local system account

  • SAP Note 1388857 (Error: “Cannot open file <>. Please check its path and permissions.” - Data Services XI 3.2 (12.2))
    –> I have no problem executing other jobs and reading the files. I only have the issue when using the ABAP data flow :frowning:

It is not clear to me which user is trying to access the files. The SAP user? The job server? If the job server, I don’t understand why it has issues in this particular job when all the others are fine.

Help appreciated.

Best Regards
Isabelle


isathore :india: (BOB member since 2010-10-11)

Isabelle, the information you provided is very confusing.
On the one hand you state that you have no problems with files, just with the BAPI call. On the other hand, a BAPI call is not related to files.

Can we test step by step?

The first thing I would do is replicate the BAPI job but without the BAPI. You said you do not have problems with the files, but reading your text carefully, you never stated that the target file can be written. So read the Excel file and write a file at the same location as the BAPI output, just copying the data. If you are right, this dataflow will not have any issues.
If I am guessing right, the computer running the jobserver under the local Administrator account does not have access to the target file. Either because it is a network path - local Admins have full access to local files but no permissions on network shares - or because the path does not exist on the jobserver computer.


Werner Daehn :de: (BOB member since 2004-12-17)

Hi Werner,

Let me state this again and try to be clearer.

I have created several dataflows NOT involving the use of BAPI functions in queries. In those dataflows, I don’t experience file access errors.

I just started trying to use the DS functions on SAP servers. I created a data flow as mentioned in my post, which contains

  • An Excel source file
  • A query using a BAPI function for an ERP system
  • A flat target file

When I execute the job, I get errors saying that I cannot access the source and target files. I don’t think I have mentioned that I have issues with the BAPI call itself. What I have mentioned, or at least tried to, is that I have file access issues with the source and target files, which reside on the local machine where the job server is also running.

What I don’t understand is when I run OTHER jobs which have source and target file in the same location, I don’t get file access errors.

I hope this clarifies what I was trying to explain.

I will investigate further to see why I suddenly get those access errors when I don’t get them with other dataflows using files in the same locations :(.

Thanks a lot & Best Regards
Isabelle


isathore :india: (BOB member since 2010-10-11)

Okay, makes sense.

  • What is the logon information (user) for the service?
  • What is the path of the source file according to the file reader? By that I mean: go to the dataflow and open the reader object there, not the object in the object library. The object itself has a default path; the actual path used is defined in the reader.
  • Same thing for the loader.
  • Copy the exact error message you get.
  • Prove to me that the reader/loader path exists on the server running the jobserver. Not the Designer computer.
  • Can you open the Excel file right now? Maybe somebody else has it open, e.g. a still running/hung DS job?
  • Can you delete the target file? Somebody might hold a lock there.
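The checks above can be scripted on the machine running the jobserver. A minimal sketch in plain sh, with hypothetical placeholder paths (substitute the exact paths shown in the reader and loader objects):

```shell
#!/bin/sh
# Sketch: verify a reader/loader path from the jobserver's point of view.
check_path() {
  f="$1"
  dir=$(dirname "$f")
  # The directory must exist on THIS machine, not just on the Designer PC.
  [ -d "$dir" ] || { echo "missing directory: $dir"; return 1; }
  [ -r "$f" ]   || { echo "cannot read: $f"; return 1; }
  # Crude lock test: appending nothing fails if the file is locked or unwritable.
  ( : >> "$f" ) 2>/dev/null || { echo "locked or not writable: $f"; return 1; }
  echo "ok: $f"
}

# Demo with throwaway files; replace with the real source/target paths.
for f in /tmp/demo_source.txt /tmp/demo_target.txt; do
  touch "$f"
  check_path "$f"
done
```

The append test is only a crude lock check on Unix; on Windows, a file held open by Excel will typically refuse the open outright, which is the behavior you want to detect.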

Werner Daehn :de: (BOB member since 2004-12-17)

Hi Werner,

As I expected, this was a user error :oops: and your detailed questions helped me identify the problems.

  • There was a slight typo in one of the files
  • The source file was indeed opened!

Thanks again. Sorry for such a stupid question!

Best Regards
Isabelle


isathore :india: (BOB member since 2010-10-11)

No problem. I am sure this thread will help others with similar questions.


Werner Daehn :de: (BOB member since 2004-12-17)

Hello,

I’m getting the same error FIL-080101, but when Data Services tries to read its own error index file! :shock:

The error and index files are both created, and the error happens almost as soon as the first dataflow is launched. I tried tweaking fileopen_retry_time as another post suggested, but I don’t think it is considered at all. I don’t think it is a security setting either. Other jobs run successfully from this same job server.

Error file content:
(12.2) 12-09-10 17:50:43 (E) (233980:0001) FIL-080101: Cannot open file /data2/dataservices/log/JS_QA_ETL434/workdata_fenix1000__ds_lcl_repo_etluser/error_12_09_2010_17_50_19_1__063d8802_3ffa_44a1_b00f_ec5c2dfc5efe.txt.idx in ‘rb’ mode. OS error message . OS error number <2>. al_engine reached ‘fileopen_retry_time’ limit so exiting

Index file content:
1291935043 1 0000000000 000557

I’m running Data Services 12.2.2.1 on AIX; we used this same box to run Data Integrator for a long time with no issues. We did a clean install using a new user and a different installation path.

Any idea?
Thanks


butanski :mexico: (BOB member since 2009-05-08)

The cause of the error you are getting is a little different: there is some issue writing to the job error log. This may happen if you have multiple dataflows running in parallel.

Are you running multiple dataflows in parallel?
What is getting logged in the error log? Are you getting lots of conversion warnings?


manoj_d (BOB member since 2009-01-02)

Hi,

I got the same error message. I have several dataflows running in parallel in the job. It was working fine for about 9 months, but since this morning the job hangs after executing some of the dataflows. It never errors out, but this is what I see in the log file. Can anybody tell me why this started happening all of a sudden?

(11.7) 01-05-11 11:47:34 (E) (1748:000) FIL-080101: Cannot open file e:\apps\Business Objects\Data Integrator 11.7/log/ksoveiapp036_1/sedw__ttalexan/error_01_05_2011_11_21_26_1__42e64a18_e09c_4992_aecf_78db6738ee2b.txt.idx in ‘rb’ mode. OS error message is:No such file or directory OS error number is:2 al_engine reached ‘fileopen_retry_time’ limit so exiting
(11.7) 01-05-11 12:03:44 (E) (1748:000) FIL-080101: Cannot open file e:\apps\Business Objects\Data Integrator 11.7/log/ksoveiapp036_1/sedw__ttalexan/error_01_05_2011_11_21_26_1__42e64a18_e09c_4992_aecf_78db6738ee2b.txt.idx in ‘rb’ mode. OS error message is:No such file or directory OS error number is:2 al_engine reached ‘fileopen_retry_time’ limit so exiting

Thanks.


New2DI (BOB member since 2008-10-27)

Sorry, I never got a notification that this question had a reply. The job has only one dataflow, but within it there are many lookups, which I bet can run in parallel. No conversion errors are being logged.

I have tried to troubleshoot this extensively but I can’t pinpoint the issue. If I take out some of the lookups the job might work, but it doesn’t look deterministic: I can remove some and it will work, and then later I remove the same ones and it won’t… :hb:


butanski :mexico: (BOB member since 2009-05-08)

New2DI:
Did you ever get your problem resolved?

We had the same errors

438386 1 FIL-080101 2/8/2011 10:51:22 AM Cannot open file /bobjdi/dataservices/log/JSBOBJDEV2/michaela__michaela/error_02_08_2011_10_40_01_887__154ee9af_3754_4d93_afdd_80602eaa9300.txt.idx in ‘rb’ mode. OS error message . OS error number <2>. al_engine reached ‘fileopen_retry_time’

But after the 4th attempt, the error disappeared. As much as I would like to ignore this error, I am worried that it will reappear after we go live.

If anyone has found a root cause to this error, please share!!! :hb:

Thank you!

Moto


motomotosannn (BOB member since 2010-08-16)

I’m pursuing a solution with SAP support. I will post the outcome of this research as soon as there is one.


butanski :mexico: (BOB member since 2009-05-08)

Dear all:
I have submitted a ticket to SAP and received the response below:

[i]In reviewing the information you submitted, I noticed that your ulimit for memory is rather low - much lower than our recommendation.
Can you please modify that setting, restart the job service, and then see if
that affects the issue you are experiencing?

These are our recommended settings for AIX for the DS user account:

User resource limit      Value        Comments
file (blocks)            4194302      At least 2 GB
data (kbytes)            unlimited
stack (kbytes)           512000       At least 500 MB
memory (kbytes)          2097151      At least 2 GB
nofiles (descriptors)    2000         At least 2000[/i]
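For reference, the recommended values above translate into the following shell commands for the DS user’s session. This is only a configuration sketch: on AIX the permanent place for these settings is the per-user stanza in /etc/security/limits, while the ulimit commands below affect only the current shell and processes started from it.

```shell
# Check the current limits for the DS account first.
ulimit -a

# Session-level equivalents of SAP's recommended AIX settings.
ulimit -f 4194302    # file size in 512-byte blocks (at least 2 GB)
ulimit -d unlimited  # data segment
ulimit -s 512000     # stack in KB (at least 500 MB)
ulimit -m 2097151    # memory in KB (at least 2 GB)
ulimit -n 2000       # open file descriptors (at least 2000)
```

Restart the job service afterwards so the al_engine processes inherit the new limits.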

I have applied the recommended ulimit settings for one of my job servers so I can compare the errors and see whether the above suggestion is the remedy.

Please feel free to try and see whether the ulimit setting helps fix the error.

Brian


motomotosannn (BOB member since 2010-08-16)

Hello,

The bug SAP created for my scenario was addressed in 12.2.3.2, which finally solved my issue. The bug description is very, very vague, just something about AIX + order by = wrong SQL, but it did help me (even though I don’t have an order by :shock: )

Good luck


butanski :mexico: (BOB member since 2009-05-08)

Could you please share the ADAPT number, if you know it?


ganeshxp :us: (BOB member since 2008-07-17)

I apologize again, as I didn’t get a notification about your post. The ADAPT number and description are:

ADAPT01535325
In some cases, a Data Services job that uses order by and the lookup and nvl functions in a query mapping crashes on AIX. This issue has been fixed.

I know the description is quite vague but that is something the SAP engineer team decided.

You can find it under the Data Services Fix Pack 12.2.3.2 Release Notes.


butanski :mexico: (BOB member since 2009-05-08)

I am trying to bring in an Excel file using a path
\directory path\BO\Documents

I get an error message stating that I do not have permissions. I am able to copy a file to this directory without issue.

Do I need to mention a specific drive letter or use some other syntax?

I believe that my DS user does have access.


toscajo (BOB member since 2002-09-04)