I have a couple of issues with BODS (version 4.1 SP1 Patch 5).
1-- The Designer crashes if I try to make a change in a dataflow. This does not happen for all dataflows; only one data flow causes the crash (it has several lookups, and lookups nested within them). Has anyone faced this and managed to solve it?
Note: I have observed the same issue in different versions when a single job contains many try/catch blocks. I was able to overcome it by keeping the number of open windows within the Designer to a minimum (do not open too many try/catch windows in the Designer at once).
2-- On 4.1 SP1 Patch 2 I tested Export Execution Command to output a batch file and confirmed that the batch file triggers the respective job. After the Patch 5 upgrade, however, I am no longer able to run the batch file. Below is the error I get; I have seen it discussed in another topic, but that did not resolve my issue. Please let me know if anyone has faced and solved this. Thanks in advance.
ORB::BOA_init: hostname lookup returned 'localhost' (127.0.0.1/::1)
Use the -OAhost option to select some other hostname
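For reference, the exported .bat boils down to a single launcher call. Below is a trimmed sketch of mine, not the literal generated content: the DS_ARGS placeholder stands in for the argument string Designer generated, and MYJOBSERVER is an invented hostname. It shows where I assume the -OAhost option from the error message would be appended.

@echo off
rem Sketch only: DS_ARGS stands in for the full argument string that
rem Export Execution Command wrote into the generated .bat; keep yours as-is.
rem %LINK_DIR% is the Data Services install-directory environment variable.
set "DS_ARGS=<arguments from the generated batch file>"
rem Appending -OAhost, as the error suggests, should bind the ORB to the
rem Job Server's real hostname instead of the localhost the lookup returned.
"%LINK_DIR%\bin\AL_RWJobLauncher.exe" %DS_ARGS% -OAhost MYJOBSERVER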
Thanks for the replies, folks. Yes, I did a copy-paste; this is probably the third copy of the data flow. My observations while running this data flow are:
The C:\ drive has 16 GB free; when the job starts, this goes down to 1 GB.
RAM is 8 GB, which is fully in use when I run this DF.
Dataflow: this data flow reads two tables, one from Oracle and one from SQL Server. It stages the data on SQL Server and has about 100 lookups, all required, within the same DB. The volume processed through these lookups is about 300,000 rows. I am fairly sure the memory usage and the pageable file created on C:\ are due to this many lookups. For now I have split the data flow into three separate data flows so that the memory consumption and the pageable file stay low. But if I have to deploy such heavy logic in the future, how should I tune the performance?
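For context, each of those lookups is a lookup_ext() call shaped roughly like the sketch below (table and column names are invented for illustration; the real ones differ):

# One of ~100 similar calls: pre-loads the whole lookup table into memory,
# returns PRODUCT_NAME for the matching PRODUCT_ID ('MAX' picks one row if
# several match; NULL is the default when nothing matches).
lookup_ext([DS_STAGE.DBO.REF_PRODUCT, 'PRE_LOAD_CACHE', 'MAX'],
           [PRODUCT_NAME],
           [NULL],
           [PRODUCT_ID, '=', Query_1.PRODUCT_ID])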
eganjp,
SAP DS is on Microsoft Windows Server 2008 R2.
Can you please explain how I can ensure a solid connection to the DB? I badly need this, as I run into DB-related issues every now and then.
Your memory consumption is most likely due to the join that is taking place on the Job Server. Whenever data comes from different physical databases, the join happens on the Job Server.
The 100 lookups could also be part of the problem, but only if each of those lookups caches a large number of rows.
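That caching behaviour is controlled by the second element of the lookup_ext() table spec. As a sketch with invented names: 'PRE_LOAD_CACHE' reads the entire lookup table into memory up front, while 'DEMAND_LOAD_CACHE' caches only the rows that are actually hit, which is usually far lighter on memory when the lookup tables are large:

# Demand-load variant of a typical lookup (names invented for illustration):
# only rows whose PRODUCT_ID is actually requested end up in the cache.
lookup_ext([DS_STAGE.DBO.REF_PRODUCT, 'DEMAND_LOAD_CACHE', 'MAX'],
           [PRODUCT_NAME],
           [NULL],
           [PRODUCT_ID, '=', Query_1.PRODUCT_ID])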
Connecting from Designer to a repository over a VPN or WAN connection is often detrimental to the stability of Designer. I find that developing using a repository stored on a local database (that isn’t connected to the job server) is very reliable. However, not having the repository connected to the job server makes it hard to run/test jobs.
Thanks, Egan. Sorry for the delayed response; I was able to accommodate the job at that moment. All the tables used here are from a single DB server, where we stage and then transform as recommended. The issue is probably that the dataflow deals with 170 fields, of which 100 are lookups within lookups. It is running fine so far without memory issues, as I adjusted the DOP and set some processes to run separately. The job runs slowly but does not overburden the Job Server for now.