The database client cannot connect to named pipe Error

Hi friends,

We recently upgraded from BODS 3.2 to 4.1 and are now facing an issue while running a job that worked fine in 3.2. I searched this forum for this kind of issue but didn’t get much help from past posts, so I’m posting it here directly. Can anyone kindly look into this issue and enlighten me?

Issue: The database client cannot connect to named pipe Error

Error Message:
|SubDataflow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest_1_1|Pipe Listener for IPCTarget1_computeKey-Function24_common
The database client cannot connect to named pipe <\.\pipe\55ba26b5-0dde-4933-9613-5c51f5060bbc>.
|SubDataflow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest_1_1|Pipe Listener for IPCTarget2_computeKey-Function24_common
The database client cannot connect to named pipe <\.\pipe\2faaaac9-f79c-4101-acb4-87f5b8706fe5>.
|Session TEST_OCE|Data flow TOTM1_OCE_FACTS_TM1_BUDGETS_copytest Data flow <TOTM1_OCE_FACTS_TM1_BUDGETS_copytest> received a bad system message. Message text from the child process is <&#65487;
==============================================
Collect the following and send to Customer Support:

  1. Log files(error_, monitor_, trace_*) associated with this failed job.
  2. Exported ATL file of this failed job.
  3. DDL statements of tables referenced in this failed job.
  4. Data to populate the tables referenced in the failed job. If not possible, get the last few rows (or sample of them) when the job failed.
  5. Core dump, if any, generated from this failed job.
    ==============================================>

The process executing data flow <TOTM1_OCE_FACTS_TM1_BUDGETS_copytest> has died abnormally. For NT, check errorlog.txt. For HPUX, check stack_trace.txt. Also, notify Technical Support.

I’m using a data flow with lookup functionality. It doesn’t have any Group By, and it is not set to run as a separate process, but the issue still occurs. A few of the columns use lpad and cast functions, as shown:
lpad( cast( F_OCE_BUDGETS_V2.MONTH_KEY, 'varchar(7)' ), 6, '0' )
but I’m not sure whether this is causing the issue, since the same logic worked fine in BODS 3.2.
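For what it’s worth, that expression just casts MONTH_KEY to a string and left-pads it to six characters with zeros. One thing to note: the cast is to varchar(7) while the pad width is 6, so a seven-character key would pass through unpadded. A rough Python sketch of the same logic (the function here is just illustrative, not BODS code):

```python
def lpad(value, length, pad_char):
    # Mimics lpad(cast(MONTH_KEY, 'varchar(7)'), 6, '0'):
    # cast to string, then left-pad with pad_char up to `length`.
    # Note: unlike Oracle's LPAD, str.rjust does NOT truncate
    # values that are already longer than `length`.
    return str(value).rjust(length, pad_char)

print(lpad(42, 6, "0"))        # -> "000042"
print(lpad("2013-01", 6, "0")) # -> "2013-01" (already 7 chars, unchanged)
```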

I’m not even sure whether this is a BODS or database related issue. Can someone please look into this issue and suggest a possible solution.

Thanks in advance,
Nova


novasuper007 (BOB member since 2010-02-25)

Before you get too wrapped up in the code, go into the DSConfig.txt file (found in LINK_DIR\conf) and change GLOBAL_DOP to 1. 4.x changed the default from 1 to 2. ETL jobs that weren’t written to work in a DOP > 1 environment may not work quite right.
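If you want to check it by hand, the entry looks roughly like this (I believe it lives in the [AL_Engine] section, but the exact section name and key casing may vary by release):

```ini
; %LINK_DIR%\conf\DSConfig.txt (sketch; section name assumed)
[AL_Engine]
Global_DOP=1
```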

Then rerun the job. You could restart the job server, but I don’t think that is necessary to make the property active.


eganjp :us: (BOB member since 2007-09-12)

Thanks for the reply Egan,

But I’ve checked GLOBAL_DOP, and it’s already set to 1. The issue still exists.


novasuper007 (BOB member since 2010-02-25)

Novasuper, what DBMS is this?

We may be able to add a parameter for the named pipe. Are you doing a bulk load?


ganeshxp :us: (BOB member since 2008-07-17)

Hi Ganesh,

We are using an Oracle 11g database. The target in this particular data flow (‘TOTM1_OCE_FACTS_TM1_BUDGETS_copytest’) is a temp table that isn’t used elsewhere, and no bulk load option is selected.

Thanks,
Nova


novasuper007 (BOB member since 2010-02-25)

Also, check the transforms in the Dataflow to see if any of them have “Run as a separate process” turned ON. If so, turn it OFF.


eganjp :us: (BOB member since 2007-09-12)

What about the sub-dataflow? You have a Data Transfer transform, right? Which option is the DT set to: Automatic, File, or Table?


ganeshxp :us: (BOB member since 2008-07-17)

No, Egan,

We are not using any Group By or Order By functionality. As previously mentioned, ‘Run as a separate process’ is also not selected (nothing is selected in the Advanced tab).

No, Ganesh, we didn’t use any Data Transfer transform in this data flow.

For your reference, I have attached a document which displays the objects inside the data flow.

Thanks,
Nova
data_flow_objects.doc (29.0 KB)


novasuper007 (BOB member since 2010-02-25)

Are your ETL Job Server and database server located on the same O/S?

When I say database server, this could be either the source or target tables as seen in your Dataflow, or it could be the repository database.


eganjp :us: (BOB member since 2007-09-12)

Hi Jim,

The ETL job server and database server (source & target tables) use the same OS, Windows Server 2008 R2, but I think they reside on two different machines.

Thanks,
Nova


novasuper007 (BOB member since 2010-02-25)

Just to confirm something mentioned earlier: Are you sure the “Run as separate process” box isn’t checked in the lookup in the first Query transform (I’m assuming it’s doing a lookup, based on the name)?


ht1815 (BOB member since 2008-05-23)

hi ht1815,

No, I confirm that the ‘Run as a separate process’ option isn’t checked. I’ve attached the related screenshots in the document ‘data_flow_objects_updated.doc’.

Thanks,
Nova
data_flow_objects_updated.doc (182.0 KB)


novasuper007 (BOB member since 2010-02-25)

Not to beat a dead horse, but check inside the lookup function itself, as shown on the attached pdf. In the upper right corner of the function dialog is a “Run as a separate process” box.
Document1.pdf (91.0 KB)


ht1815 (BOB member since 2008-05-23)

Thanks ht1815,

Now I get it: the ‘Run as a separate process’ option is checked in the lookup functions. I’ll uncheck it and rerun the job, and will let you know the result.

Thanks a ton,
Nova


novasuper007 (BOB member since 2010-02-25)

That setting is rarely needed in DS 3.x and beyond. In fact, my code review checklist calls it out as one of the things to look for: anyone using run as a separate process has to justify to the review group WHY they are using it.


eganjp :us: (BOB member since 2007-09-12)

I agree with Jim.

Most of the time it is enabled because of memory problems. But ticking that box will not reduce the amount of memory used… hence a redesign is better.

Besides the little fact that run-as-a-separate-process causes more problems in general…
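For anyone curious why the option produces these particular symptoms: ‘Run as a separate process’ splits the dataflow into sub-dataflows running in separate OS processes that exchange rows over named pipes (you can see the “Pipe Listener for IPCTarget” lines in the original error log). If a child process dies, the parent can no longer connect to or read from its pipe. A loose Python analogy of that parent/child pipe pattern (illustrative only, not BODS code):

```python
import multiprocessing as mp

def child(conn):
    # The "sub-dataflow": sends its rows back over the pipe, then exits.
    for row in range(3):
        conn.send(row)
    conn.close()

def run_dataflow():
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=child, args=(child_conn,))
    p.start()
    child_conn.close()  # parent drops its copy so EOF becomes visible
    rows = []
    while True:
        try:
            rows.append(parent_conn.recv())
        except EOFError:  # child closed the pipe -- or died abnormally
            break
    p.join()
    return rows

if __name__ == "__main__":
    print(run_dataflow())  # -> [0, 1, 2]
```

If the child crashed mid-stream, the parent would hit that EOFError (or a broken-pipe error) early, which is roughly what the engine is reporting when a sub-dataflow dies.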


Johannes Vink :netherlands: (BOB member since 2012-03-20)

Thank you all for your support.

The issue is resolved. I unchecked the ‘Run as a separate process’ option in the lookup_ext function wherever it was used in the job, and the job ran fine without any errors or warnings.

Thanks again,
Nova


novasuper007 (BOB member since 2010-02-25)