I have a simple stored procedure that selects records from table A (a temp table, no indexes) and inserts them into target table B (10 or more indexes). If I execute this stored procedure in SQL Developer using begin stp(); end; it takes 25 seconds to load 200k records. However, when the same stored procedure is called from Data Services using a script with sql('ds_name', 'stp'); it takes 16 minutes.
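For reference, the two invocations look roughly like this (a sketch; stp and ds_name are the names from the post, and the DS sql() function takes a datastore name and a SQL string to pass through to the database):

```sql
-- From SQL Developer: anonymous PL/SQL block
begin
  stp();
end;
/

-- From a Data Services script object, the equivalent pass-through call
-- would be something like:
--   sql('ds_name', 'begin stp(); end;');
```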
Every time the job is run, the temp table is truncated and reloaded with new data. The insert uses a direct-path hint:

INSERT /*+ APPEND */ INTO B (a, b, c, d)
SELECT a, b, c, d FROM A;
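Written out end to end, the daily reload pattern described above might look like this (a sketch; table and column names are the ones from the post, error handling omitted). One Oracle-specific caveat worth noting: after a direct-path (/*+ APPEND */) insert, the session must COMMIT before the target table can be queried again, and the hint does not skip index maintenance, so loading into a table with 10+ indexes can still be dominated by index updates.

```sql
-- Sketch of the daily reload cycle
TRUNCATE TABLE A;                      -- temp table truncated each run
-- ... A is reloaded with new data here ...

INSERT /*+ APPEND */ INTO B (a, b, c, d)   -- direct-path insert into the
SELECT a, b, c, d                          -- heavily indexed target
FROM A;

COMMIT;   -- required after a direct-path insert before B is readable again
```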
Also, if I call this stored procedure from a standalone test job it runs in 25 seconds; when it is made part of a job with 200 other dataflows, it takes 16 minutes.
You also mentioned putting it in a test job. Was it all on its own, or inside a workflow, or inside a dataflow?
And when you run it with the other 200 dataflows, is it inside one of the dataflows, in a workflow between two dataflows, or in the job between two workflows?
Whenever I’ve had to run a procedure I put it inside a workflow but outside (usually between) dataflows. Not for any particular reason, just that it seems to fit better there for my logic.
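As a sketch, a script object placed in a workflow between two dataflows would contain little more than the pass-through call itself (datastore and procedure names below are the ones from this thread, not a prescribed convention):

```
# DS script object, placed in the workflow after DF_load_A
# and before DF_load_next:
sql('ds_name', 'begin stp(); end;');
```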
Just sharing some additional information for all: I am currently using DS 4.2 SP6 P2 and use stored procedure calls all of the time.
The process is used to perform re-indexing and other tasks against MS SQL Server.
The process runs on a daily basis for transactional updates, and based on testing the run times are for the most part about the same whether run directly in SQL Server or from DS.
There is a somewhat quicker response on the database side, but that is expected.
Here is an example of how it is used from DS: when our daily process of reading flat files and loading them into DS is complete, the last step is a DS script containing the call.