Hello all,
I’ve got a strange issue with a dataflow that does a number of validation transforms.
The dataflow basically has one table as a source (volume approx. 4,000 records).
It uses the validation transform a series of times because we want to output an individual error record for each type of problem, such as duplicates, fields with nulls and fields with formatting issues (e.g. non-numeric values). Hence there is a validation transform for each check, and each one passes the records that pass on to the next validation transform. The data that passes all the checks gets output to another staging table.
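To make the chain of checks concrete, here is a minimal sketch in plain Python of the logic the dataflow implements. This is only an illustration of the pass/fail chaining, not Data Services code; the field names (`id`, `amount`) and error labels are made up for the example.

```python
# Each check returns (passed, errors): records that survive go on to the
# next check, failures become individual error records -- mirroring a
# chain of validation transforms.

def check_duplicates(records, key):
    seen, passed, errors = set(), [], []
    for r in records:
        if r[key] in seen:
            errors.append({**r, "error": "duplicate"})
        else:
            seen.add(r[key])
            passed.append(r)
    return passed, errors

def check_nulls(records, fields):
    passed, errors = [], []
    for r in records:
        if any(r.get(f) is None for f in fields):
            errors.append({**r, "error": "null field"})
        else:
            passed.append(r)
    return passed, errors

def check_numeric(records, fields):
    passed, errors = [], []
    for r in records:
        if all(str(r[f]).isdigit() for f in fields):
            passed.append(r)
        else:
            errors.append({**r, "error": "non-numeric"})
    return passed, errors

rows = [
    {"id": 1, "amount": "100"},
    {"id": 1, "amount": "200"},   # duplicate id -> error record
    {"id": 2, "amount": None},    # null field  -> error record
    {"id": 3, "amount": "12x"},   # non-numeric -> error record
    {"id": 4, "amount": "42"},
]
rows, dup_errs = check_duplicates(rows, "id")
rows, null_errs = check_nulls(rows, ["amount"])
rows, fmt_errs = check_numeric(rows, ["amount"])
# rows now holds only the records that passed every check
```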
The strange thing is that when the code is run in the dev environment, it flies. The volume and the number of validation errors and passes are similar in both environments.
However, when the exact same dataflow is run in our test environment, the job pretty much stalls. In the job monitor all the listed entries are set to Proceed and stay there for up to an hour.
I’ve asked the DBAs to look at the database and they say that there’s hardly any load on it. I have also looked at the CPU and memory usage and there’s nothing out of the ordinary. I have also run a trace with the query transform option set, and by the time it reaches the dataflow with the validations it just hangs: the trace shows the initial entry of the dataflow starting and nothing more.
Furthermore, when I run the dataflow in debug mode, the code flies again. I think the issue is that, for some reason, on the test DS server the code is being optimised differently from the dev server.
I’ve tried a number of changes to the dataflow settings, but given the much better performance in debug mode, I think the root cause has to be in how the code is being optimised.
Any suggestions would be most appreciated.
I’m on DS version 4.1. All databases are on Oracle 11g.
An addendum to this: my dev server is on 4.1 SP1 and the test server is on 4.1 SP3.
bods_style (BOB member since 2014-04-02)