BusinessObjects Board

Querying/Joining BW Tables (DSOs) - Slow Performance

When using DS to pull data from two BW source tables (one transaction table and one parameter table) with a non-equi join, performance of the DS job is extremely slow.
We use a standard data flow (ABAP data flows are not allowed).

Original Job Setup:
Step 1.1: Pull data from the transaction table where the update date is greater than the last run date. Record counts are generally in the order of 1-2 million.

Parallel Step 1.2: Pull fiscal_yr_month (YYYYMM) from the parameter table.

Sequential Step 2: A Query transform joins the data from 1.1 and 1.2 so that only those records from 1.1 are passed where fiscal_yr_month is greater than the value in the parameter table. In 1.1, fiscal_yr_month is part of the key and is numeric.
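In other words, the Query transform's join/filter condition is effectively this (table names are illustrative, not the real ones):

    TRANS.FISCAL_YR_MONTH > PARAM.FISCAL_YR_MONTH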

Performance is very slow and the job runs for around an hour; it is clearly waiting for BW to provide data. The optimized SQL shows two queries: one for the transaction table and one for the parameter table.

Variation that works:
If I do not pull the yr_month from the parameter table and instead hard-code it in Step 2, the job runs much faster, finishing within minutes. The entire query is pushed down as one single query.
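For illustration, the difference in the generated SQL looks roughly like this (table names, column names, and values are made up; the actual statements depend on the datastore):

    -- Original setup: DS issues one SELECT per source and performs
    -- the non-equi join itself on the job server:
    SELECT <columns> FROM ZTRANS_TABLE WHERE UPDATE_DATE > '2012-09-01';
    SELECT FISCAL_YR_MONTH FROM ZPARAM_TABLE;

    -- Hard-coded variation: the filter collapses into a single
    -- pushed-down query:
    SELECT <columns> FROM ZTRANS_TABLE
    WHERE UPDATE_DATE > '2012-09-01'
      AND FISCAL_YR_MONTH > 201206;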

The only way I can automate this variation is to write the parameter to a file in a separate data flow, read the parameter from the file into a variable in a script task, and then use this variable in the variation above.
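As a hedged sketch of that script step (the file path and variable names are made up; this assumes a Windows job server, exec() with return flag 8 to capture the command's output, and a global variable declared with a numeric type):

    # An earlier data flow has written the parameter value to a flat file.
    # Read it back, then convert it to match the numeric FISCAL_YR_MONTH
    # column (trailing whitespace from the file may need trimming).
    $G_FISCAL_YM_STR = exec('cmd', '/c type D:\BODS\fiscal_ym.txt', 8);
    $G_FISCAL_YM = to_decimal(rtrim_blanks($G_FISCAL_YM_STR), '.', ',', 0);

    # $G_FISCAL_YM is then used in the Query transform's WHERE clause:
    #   TRANS.FISCAL_YR_MONTH > $G_FISCAL_YM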

I expected the initial setup to push down the best possible query.

Any idea what's wrong with the initial setup? I'm not sure I have explained it well enough, but any pointers would be good. My workaround works, but I would like to keep things simpler, and I would expect DS to push down the best possible query.


tomhawk_usa (BOB member since 2012-09-10)

I don’t think push-down handles what you want. I think you are correct that you will need to assign the parameter to a variable.

I have found through experience that it is best to pass everything you want pushed down via variables. Also, make sure the variables have the right data types, as implicit conversions often kill push-down.
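As a minimal sketch of the data-type point (names are illustrative): declare the global variable with the same type as the column it filters, rather than as varchar:

    # Job-level declaration: $G_FISCAL_YM decimal(6,0), matching the
    # numeric FISCAL_YR_MONTH column. Set it in an initialisation script:
    $G_FISCAL_YM = 201206;

    # Query transform WHERE clause:
    #   TRANS.FISCAL_YR_MONTH > $G_FISCAL_YM
    #
    # If $G_FISCAL_YM were varchar, the generated SQL may wrap the column
    # in a conversion, which can stop the filter being pushed down.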


Leigh Kennedy :australia: (BOB member since 2012-01-17)

Thanks for the response, Leigh. We ended up pushing the filter logic to a downstream application.

Bottom line: joining two tables from BW using a standard data flow tends to run very slowly.

A similar setup against an RDBMS, say Oracle, works flawlessly, hence I expected it to work with BW too. I'm not sure whether an ABAP data flow would improve matters, as we cannot use them. I would be glad if someone knows a way around this kind of issue, as we are expecting a huge number of projects that will go after multiple BW tables.


tomhawk_usa (BOB member since 2012-09-10)

I believe 4.1 handles SAP tables differently, so it ‘might’ help.


Leigh Kennedy :australia: (BOB member since 2012-01-17)

Here’s what SAP have to say about extracting data:
http://wiki.sdn.sap.com/wiki/display/EIM/Overview+SAP+Interfaces

In my experience, using a standard dataflow for SAP extraction is too slow. If you are just looking up a couple of rows you can just about get away with it. I am currently rewriting all of the SAP extracts on my current project to use ABAP dataflows; extract times have reduced by at least 50% in all cases.
What is stopping you from using ABAP dataflows?


Nemesis :australia: (BOB member since 2004-06-09)

If he is like me, it's change control issues.

My understanding is that Extractors are just ABAP programs that follow certain conventions. Extractors can be very fast or very slow, depending on how much effort has gone into them. So they shouldn’t be better or worse than ABAP data flows in most cases.

My understanding is that how tables are accessed has changed between 4.0 and 4.1. I'd be interested to hear from someone with real-world experience of this.


Leigh Kennedy :australia: (BOB member since 2012-01-17)

Hi Leigh, we might be going off-topic now, but I'd be interested to know more about your change control issues. In the past, I have used the following system landscape setup:
Development - SAP application datastores have the ABAP execution option set to ‘Generate and Execute’.
All other systems (QA, PROD, etc.) have the ABAP execution option set to ‘Execute Preloaded’.
That way, there's no way of inadvertently modifying the ABAP code once it's outside of development. This seems to satisfy the change control folks. What issues are your change control team raising?


Nemesis :australia: (BOB member since 2004-06-09)

I think Leigh is mixing R/3 flows with Extractors at the moment.

By the way, I created a separate topic on RFC streaming in BODS 4.1:


Johannes Vink :netherlands: (BOB member since 2012-03-20)

It's not a technical issue; it's more the attitude that any SAP change needs business testing and sign-off. It is simply too expensive, resource-wise, for us to do. It's an issue with our organisation, not the technology.

What you say is true, but our SAP people don't keep our dev and QA environments up to date. They say it is too costly to refresh them.


Leigh Kennedy :australia: (BOB member since 2012-01-17)

I think you are right, I have misunderstood. So reading tables is just as flaky as in 4.0, then?


Leigh Kennedy :australia: (BOB member since 2012-01-17)

Well, according to SAP it is working perfectly. But with every technology there is a new limitation, so I am curious what the next limits are. And if there are no drawbacks, I will embrace this new feature fully :mrgreen:

But remember the extractors too. That was the marketing talk of its day: it was going to be a huge change. Well… it was, but it had a lot of limitations.


Johannes Vink :netherlands: (BOB member since 2012-03-20)

I can sympathise. I’ve got the same issue at my current site. Our code runs like a dream in PROD but we have a nightmare getting it through QA as the environment contains any old rubbish that was relevant 3 years ago!


Nemesis :australia: (BOB member since 2004-06-09)