We recently upgraded from BODS 3.2 to 4.1, and we are facing an issue while running a job that worked fine in 3.2. I searched this forum for similar issues but didn’t get much help from past posts, so I’m posting my issue here directly. Can anyone kindly look into this and enlighten me?
Issue: “The database client cannot connect to named pipe” error
Error Message:
|SubDataflow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest_1_1|Pipe Listener for IPCTarget1_computeKey-Function24_common
The database client cannot connect to named pipe <\.\pipe\55ba26b5-0dde-4933-9613-5c51f5060bbc>.
|SubDataflow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest_1_1|Pipe Listener for IPCTarget2_computeKey-Function24_common
The database client cannot connect to named pipe <\.\pipe\2faaaac9-f79c-4101-acb4-87f5b8706fe5>.
|Session TEST_OCE|Data flow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest Data flow <TOTM1_OCE_FACTS_TM1_BUDGETS_copytest> received a bad system message. Message text from the child process is <ᅬ
==============================================
Collect the following and send to Customer Support:
Log files(error_, monitor_, trace_*) associated with this failed job.
Exported ATL file of this failed job.
DDL statements of tables referenced in this failed job.
Data to populate the tables referenced in the failed job. If not possible, get the last few rows (or sample of them) when the job failed.
Core dump, if any, generated from this failed job.
==============================================>
The process executing data flow <TOTM1_OCE_FACTS_TM1_BUDGETS_copytest> has died abnormally. For NT, check errorlog.txt. For HPUX, check stack_trace.txt. Also, notify Technical Support.
The data flow in question uses lookup functionality. It doesn’t have any Group By, and ‘Run as a separate process’ isn’t enabled, yet the issue still occurs. A few of the columns use the lpad and cast functions, as shown:
lpad( cast( F_OCE_BUDGETS_V2.MONTH_KEY, 'varchar(7)' ), 6, '0' )
but I’m not sure whether this is causing the issue, since the same logic worked fine in BODS 3.2.
I’m not even sure whether this is a BODS issue or a database issue. Can someone please look into this and suggest a possible solution?
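For context, that expression simply casts MONTH_KEY to a string and left-pads it with zeros to six characters. A rough sketch of the same call in a script object (the variable and sample value below are made up purely for illustration):

# illustrative only: cast a numeric key to varchar, then left-pad it to 6 characters with '0'
$sample_key = 42;
print( lpad( cast( $sample_key, 'varchar(7)' ), 6, '0' ) );
# writes '000042' to the trace log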
Before you get too wrapped up in the code, go into the DSConfig.txt file (found in LINK_DIR\conf) and change GLOBAL_DOP to 1. 4.x changed the default from 1 to 2. ETL jobs that weren’t written to work in a DOP > 1 environment may not work quite right.
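For reference, the entry in DSConfig.txt looks roughly like this (I’m quoting the section name from memory, so it may differ slightly on your install):

[AL_Engine]
Global_DOP=1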
Then rerun the job. You could restart the job server, but I don’t think that is necessary to make the property active.
We are using an Oracle 11g database. A temp table (which isn’t used anywhere else) is the target in this particular data flow ‘TOTM1_OCE_FACTS_TM1_BUDGETS_copytest’, and it doesn’t have any bulk load option selected.
We are not using any Group By or Order By functionality. As previously mentioned, ‘Run as a separate process’ is also not selected (nothing is selected in the Advanced tab).
No, Ganesh, we didn’t use a Data Transfer transform in this data flow.
For your reference, I have attached a document which displays the objects inside the data flow.
The ETL job server and the database server (source and target tables) run the same OS, i.e. Windows Server 2008 R2, but I think they reside on two different machines.
Just to confirm something mentioned earlier: Are you sure the “Run as separate process” box isn’t checked in the lookup in the first Query transform (I’m assuming it’s doing a lookup, based on the name)?
I confirm that the ‘Run as a separate process’ option isn’t checked. I’ve attached the related screenshots in the document ‘data_flow_objects_updated.doc’.
Not to beat a dead horse, but check inside the lookup function itself, as shown in the attached PDF. In the upper right corner of the function dialog there is a “Run as a separate process” box. Document1.pdf (91.0 KB)
That setting is rarely, if ever, needed in DS 3.x and beyond. In fact, in my code review checklist I call it out as one of the things to look for: anyone using ‘Run as a separate process’ has to justify to the review group WHY they are using it.
The issue got resolved. I unchecked the ‘Run as a separate process’ option in the lookup_ext function wherever it was used in the job, and the job then ran fine without any errors or warnings.