Hello,
I have a DF (data flow) with two source tables feeding a merge. That merge then feeds a second merge with another source table, and finally goes to the output.
Table A has 240,416 rows.
Table B has 41,126 rows.
JOIN condition:
Table A LEFT OUTER JOIN Table B ON A.column1 = B.column4
Table C has 600 rows.
But the graph takes almost 50 minutes to run. It's the first part of the graph that has the issue: I tested by removing Table C, and the graph still took the same 50 minutes. All I intend to do is replace a column in A if I find a match in B.
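For reference, "replace a column in A on a match in B" is the classic left-outer-join-plus-COALESCE pattern. Here is a minimal sketch using SQLite from Python (the actual engine here is SQL Server, and all column names other than A.column1 and B.column4 are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Hypothetical schemas: "value" is the column in A to replace,
# "replacement" is the column in B that supplies the new value.
con.executescript("""
CREATE TABLE A (column1 INTEGER, value TEXT);
CREATE TABLE B (column4 INTEGER, replacement TEXT);
INSERT INTO A VALUES (1, 'old1'), (2, 'old2'), (3, 'old3');
INSERT INTO B VALUES (1, 'new1'), (3, 'new3');
""")

# LEFT OUTER JOIN keeps every row of A; COALESCE takes B's value
# when the join matched, otherwise keeps A's original value.
rows = con.execute("""
SELECT A.column1, COALESCE(B.replacement, A.value) AS value
FROM A LEFT OUTER JOIN B ON A.column1 = B.column4
ORDER BY A.column1
""").fetchall()
print(rows)  # [(1, 'new1'), (2, 'old2'), (3, 'new3')]
```

The same SELECT works essentially unchanged in T-SQL, so if the tool pushes the join down to the database, this is the query shape it should be running.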
If the join is pushed down to the database, it could be a lack of statistics causing the DBMS to pick a less-than-optimal query plan. I've seen Oracle do this.
Jim,
I am using SQL Server as the underlying DB engine. Please let me know if there is a way to shorten the execution time. The main production tables run into millions of records, and that would increase the time dramatically.
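On SQL Server the usual first steps are an index on the join column of the lookup table (here B.column4) and refreshed statistics on both tables (`UPDATE STATISTICS`); whether those help depends on the plan the optimizer was choosing. The effect of the index can be sketched with SQLite (stand-in engine; index name is made up): the inner table flips from a full SCAN to an index SEARCH, turning the join from roughly |A| x |B| work into |A| x log |B|.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE A (column1 INTEGER)")
con.execute("CREATE TABLE B (column4 INTEGER)")
# Index the join column of the lookup table.
con.execute("CREATE INDEX idx_b_column4 ON B (column4)")

for row in con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM A LEFT JOIN B ON A.column1 = B.column4"
):
    print(row[3])  # B is now probed via idx_b_column4 (SEARCH, not SCAN)
```

If the join runs inside the DF tool instead of being pushed down, the analogous fix is a lookup/hash component keyed on B.column4 rather than a row-by-row merge.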