Hi,
I have developed an ETL job with DI 11.7.2 on Windows XP Professional on my laptop (Core Duo, 2 GB of memory, 2 MB cache), and the ETL ran in 7 to 9 minutes at most.
Now I have deployed the job to another, supposedly more powerful server (Solaris 10, 4 GB of memory, dual 1.6 GHz processors), but the load takes 30 minutes to complete.
Your laptop might have a 3.2 GHz CPU, which is twice as fast just looking at the clock speed; Intel is CISC versus SPARC, a RISC CPU. So if your job consumes a lot of CPU, that result is possible.
The Unix platforms are good at parallel processing and disk I/O, which is why they usually post better numbers. Why don't you run the benchmark mentioned in my signature below to see how you compare? There are Solaris numbers as well, and if you post your results, we can discuss the findings.
My sources are flat files, and the target is an Oracle 10g database; that is the same on my laptop and on the server.
The job server and the database are already on the same machine.
No, my laptop CPU is 2 GHz. As for CPU usage while the load runs: on my laptop it reaches 100%, but on the Solaris server al_engine and al_job_server never exceed 4.7%.
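To double-check those CPU figures on the Solaris box, a quick sketch using standard POSIX tools (the process name al_engine is taken from this thread; output columns may vary slightly between systems):

```shell
# List CPU usage for the DI engine processes named in the thread.
# POSIX ps options work on both Solaris 10 and Linux.
ps -eo pcpu,comm | grep -i al_engine || echo "al_engine not running here"

# On Solaris 10, prstat gives a live, CPU-sorted view (uncomment to use):
# prstat -s cpu -n 10
```

If al_engine really sits under 5% while the job runs, the engine is waiting on something (disk, network, or the database) rather than computing.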
Where is the source flat file located? Is it on the network or local to the job server? It should be the latter.
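One quick way to verify whether a file is on local disk or an NFS mount is to ask df which filesystem holds it (using /tmp here as a stand-in path; substitute the real flat-file directory):

```shell
# Show which filesystem holds the directory (/tmp is a stand-in path).
# A "host:/export/..." entry in the first column means NFS, i.e. network.
df -P /tmp

# On Solaris 10, `df -n <path>` prints the filesystem type directly
# (ufs/zfs = local disk, nfs = network mount):
# df -n /tmp
```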
You can also try putting a map_operation before your target table and setting all of the rows to Discard. That will tell you whether the bottleneck is in loading the target database or somewhere earlier in the process (reading the data, or transforming it in query lookups).
Yes, the flat files are in the same location as the job server. I should point out that the part that takes a long time is loading and moving the files.
I even tried the API bulk loader option on the target table, but there is no difference!
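If the bulk-load option makes no difference, one way to find out whether Oracle itself is the slow part is to time an equivalent load outside DI with SQL*Loader in direct-path mode. This is only a hypothetical sketch; the file, table, and column names below are placeholders for your own:

```sql
-- test_load.ctl : hypothetical SQL*Loader control file
LOAD DATA
INFILE 'source_file.dat'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

Run it with direct path enabled, e.g. `sqlldr userid=user/password control=test_load.ctl direct=true`. If this standalone load is also slow on the Solaris box, the bottleneck is on the database side rather than in DI.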