BusinessObjects Board

Named pipe error occurred in push-down

I can relate to all the causes raised in this thread; I am experiencing the very same problems under the same conditions (Run as Separate Process enabled, large data volumes, and one of the sub data flows just falling over for some reason).

Any input on future fixes would be most appreciated.


ErikR :new_zealand: (BOB member since 2007-01-10)

Specifically for long running sub-data flow issues (the run-as-separate-process type of thing)… check these settings in your job server’s DSCONFIG file:

[AL_Engine]

DFRegistrationTimeoutInSeconds=300
NamedPipeWaitTime=100
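
Since DSCONFIG is an INI-style file, the settings above can also be checked programmatically. A minimal sketch (the fragment is parsed inline here for illustration; on a real job server you would read the DSConfig file itself, whose path varies per install):

```python
from configparser import ConfigParser

# The relevant DSCONFIG fragment, inlined for this example.
SAMPLE = """
[AL_Engine]
DFRegistrationTimeoutInSeconds=300
NamedPipeWaitTime=100
"""

cfg = ConfigParser()
cfg.read_string(SAMPLE)

# ConfigParser option lookups are case-insensitive by default.
timeout = int(cfg["AL_Engine"]["DFRegistrationTimeoutInSeconds"])
wait = int(cfg["AL_Engine"]["NamedPipeWaitTime"])
print(timeout, wait)  # 300 100
```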


dnewton :us: (BOB member since 2004-01-30)

Pff… Same issue @ BODS 12.2.2.3

Data transfers are NOT working:

  • The manually set transfer type is ignored. Although the Transfer Type is set to Table, BODS executes it as Automatic, which causes it to create the object in a random datastore that has Automatic Data Transfer enabled.
  • The manually given name is ignored. The automatically generated table names are longer than 30 characters, which is not possible in Oracle…

@Werner, please advise and solve this issue. The data transfer has been unreliable for many versions now.

Errors:
ORA-00972: identifier is too long
Named pipe error occurred:
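
For context, the ORA-00972 part of this is simply Oracle's classic limit of 30 bytes per identifier (only raised in much later Oracle releases), so any generated table name longer than that fails. A minimal sketch of that check, with a hypothetical helper name (not BODS internals):

```python
# Oracle versions of this era limit identifiers to 30 bytes; longer
# generated table names fail with ORA-00972.
ORACLE_MAX_IDENTIFIER = 30

def fits_oracle_identifier(name: str) -> bool:
    """True if the name fits Oracle's classic 30-byte identifier limit."""
    return len(name.encode("utf-8")) <= ORACLE_MAX_IDENTIFIER

print(fits_oracle_identifier("STI_AGG_PREMIE_VERZ2"))                 # True (20 bytes)
print(fits_oracle_identifier("STI_AGG_PREMIE_VERZ2_STI_AGG_PREMIE"))  # False (35 bytes)
```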


BBatenburg :netherlands: (BOB member since 2008-09-22)

Curious, we’re not seeing this in 12.2.2.3. Have you tried deleting the Data Transfer step from the dataflow, saving, then adding it back?


dnewton :us: (BOB member since 2004-01-30)

The “ORA-00972: identifier is too long” issue for the Data Transfer transform is fixed in 12.2.3.0.

I still have to check whether the following are issues in 12.2.3.0:

  • The manually set transfer type is ignored. Although the Transfer Type is set to Table, BODS executes it as Automatic, which causes it to create the object in a random datastore with Automatic Data Transfer enabled.
  • The manually given name is ignored.

manoj_d (BOB member since 2009-01-02)

Hi all,
It is a new dataflow that is still in the development stage.

I checked the release notes:

"When a job contains a Data Transfer transform in a join query, the product
sometimes generated an Oracle alias with more than 30 characters, which
caused the job to fail with an Oracle error, “ORA-00972: identifier is too long”.
This issue has been resolved.
ADAPT01387906
"

This issue has indeed been resolved (thanks Manoj). It is, however, not findable in the SAP notes using the search…

The other issue is still open. Tomorrow I am going to try disabling Use Collected Statistics and setting the dataflow to pageable cache. I suspect BODS is overruling the manually set transfer type when this is enabled…

Will keep you updated.


BBatenburg :netherlands: (BOB member since 2008-09-22)

Here is a flow a developer made that triggers the problem.

Notes:

  • The second Data Transfer is disabled
  • Please don’t mind the ‘naming conventions’…
    :roll_eyes:

The Data Transfer’s Transfer Type is Table. In the log, however, you see:
The Automatic Data_Transfer transform <STI_AGG_PREMIE_VERZ2_STI_AGG_P> has been resolved to a transfer object <DT__1595_7149_1_1(DSV_TRINICOM.DW_STI_TRINIC)>.

For an unclear reason, BODS handles the “Data Transfer” object as an automatic data transfer, which it is not. When I disable the first DT, the job never fails with the named pipe error / identifier too long messages.
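
One way to spot this behaviour without reading a whole trace log by hand: the log line quoted above has a fixed shape, so it can be grepped for. A sketch (the function name is illustrative, not a DS utility):

```python
import re

# Matches the trace line DS emits when the optimizer resolves an automatic
# Data_Transfer transform to a transfer object, as quoted above.
AUTO_DT_RE = re.compile(
    r"Automatic Data_Transfer transform <(?P<transform>[^>]+)> has been "
    r"resolved to a transfer object <(?P<target>[^>]+)>"
)

def find_auto_dts(log_text: str):
    """Return (transform, transfer_object) pairs found in a job trace log."""
    return [(m.group("transform"), m.group("target"))
            for m in AUTO_DT_RE.finditer(log_text)]

line = ("The Automatic Data_Transfer transform <STI_AGG_PREMIE_VERZ2_STI_AGG_P> "
        "has been resolved to a transfer object "
        "<DT__1595_7149_1_1(DSV_TRINICOM.DW_STI_TRINIC)>.")
print(find_auto_dts(line))
```

An empty result on a job's trace log would indicate the optimizer did not insert an automatic DT for that run.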

Related topic:
https://bobj-board.org/t/152941

Werner’s reply to this issue:
“In the meantime development read through the thread and believes you found a bug. An ADAPT case was created.”
However, it still seems to exist in 12.2.2.3.
dt.JPG
df.JPG
df_log.JPG


BBatenburg :netherlands: (BOB member since 2008-09-22)

Attached is the reply from SAP support (26-04-10) regarding this issue.

This answer implies that BODS adds a DT itself and that the error isn’t caused by the DT in the flow. I disagree with this conclusion; it doesn’t explain why the flow is successful when we disable the first DT.

What is your opinion on this?
sap_reply.JPG


BBatenburg :netherlands: (BOB member since 2008-09-22)

OK, you did file a support case for this. By any chance, do you remember the incident number? Since you have provided the ATL and other details in the case, I can take the ATL, see what internal DT DS creates, and check why removing the user-added DT from the DF avoids this issue of ignoring the user-set transfer type (Table).


manoj_d (BOB member since 2009-01-02)

Hi Manoj,

I have a message number: 322722 / 2010. Created 19.04.2010 - 9.21.11 CET

Thanks!


BBatenburg :netherlands: (BOB member since 2008-09-22)

The ATL attached to the incident is for the DF image that you posted in the following post.

You were having two issues:
1 - ORA error, identifier too long: this is fixed in 12.2.3
2 - the transfer type is not used correctly

For issue 2, from the ATL in the case, DS is using transfer type Table for the Data Transfer transform in the dataflow (the transfer type is set to Table, and a temp table is specified),

but it is also creating another DTT (with transfer type Table and a generated table name) to load the temp table used in the Data Transfer transform after reading the results from the source table.

If you enable trace for the reader and loader, you will see the SQL that reads from this table and loads into the temp table used in the Data Transfer transform (using either INSERT … SELECT or BULK LOAD).

I am not sure why this optimization is required, or whether it is actually an optimization; I have to check this.

I am attaching a modified screenshot of that DF; if you modify your DF accordingly and run it, you will not see the DTT created by the optimizer.

For the DF that you posted in this post I will need the ATL; you can either send it to me (mdhyani at hotmail) or, if you have an open DS case with support, attach it to that (and give me the incident #).
df_modified.zip (16.0 KB)


manoj_d (BOB member since 2009-01-02)

Hi Manoj,

Your conclusion about the DF attached to the previous SAP note was correct: that one was indeed caused by another DT added by BODS. However, I think this one is different.

In the dataflow (print screen) that I attached, the strange things that occurred were:

  • The dataflow with all DTs disabled ran fine
  • We enable the first Data_Transfer
    Result: BODS automatically adds another DT. From there on it goes terribly wrong:
    • It chooses a random datastore that has Auto-DT enabled (differs for every run), although the datastore has nothing to do with the job/DF
    • It is unclear why/when BODS suddenly adds the DT
    • (The SQL being used isn’t displayed in the validated SQL)
    • (And of course, the >30 chars in the table name)

Attached is the dataflow.
hoofdverz_atl.zip (37.0 KB)


BBatenburg :netherlands: (BOB member since 2008-09-22)

Is the DF modified? If I run the DF, I don’t see DS adding an additional DTT, whether Enable Transfer is selected or not for the Data_Transfer2 transform.

In the screenshot I see only one instance of ST_AGG_PREMIE_VERZ2 as a source, but after importing the ATL I see three instances of ST_AGG_PREMIE_VERZ2 as a source table.


manoj_d (BOB member since 2009-01-02)

Hi Manoj, sorry, you’re right. I made an adjustment to the developer’s ETL. Attached is the original flow.

I did a retest.
When I disable the first DT: no automatic DT is added.
When I enable the first DT: an automatic DT is added:
The Automatic Data_Transfer transform <STI_AGG_PREMIE_VERZ2_STI_AGG_P> has been resolved to a transfer object <DT__1642_7224_1_1(DSV_TRINICOM.DW_STI_TRINIC)>

Strange things:

  • Why suddenly add a DT?
  • Why pick the datastore DSV_TRINICOM? It plays no role in this dataflow/job.
  • Validated SQL isn’t displayed for the DTs (Validation >> Display Optimized SQL)
  • (The 30-char problem doesn’t occur here)

Let me stress: the ETL built in this example is far from optimal. However, the engine behaviour is strange, IMO.


BBatenburg :netherlands: (BOB member since 2008-09-22)

The ATL is not attached. There are a few cases in which DS will create an automatic DTT; I am getting more information on those situations from the developer.


manoj_d (BOB member since 2009-01-02)

:reallymad: third attempt…
(thanks for your patience :))
df_hoofdverz.zip (37.0 KB)


BBatenburg :netherlands: (BOB member since 2008-09-22)

I am attaching a doc explaining why the DS optimizer has to insert an automatic DTT and how to prevent that from happening.
why_ds_inserts_auto_dtt_transform.zip (79.0 KB)


manoj_d (BOB member since 2009-01-02)

Manoj, I forgot to thank you for your extensive analysis. I now better understand the auto DTT behaviour.
Thanks!


BBatenburg :netherlands: (BOB member since 2008-09-22)

Have 12.1.1.0 and 12.2.0.0 installations and NamedPipeWaitTime does not appear at all in DSCONFIG. Is this setting only present in a newer version of DI?


jmcilvaine :us: (BOB member since 2007-06-14)
