BO Runtime Performance


Hi Everyone,

I need to pick some brains. I am trying to improve the performance of running Business Objects reports (Version 4.1) on MS SQL Server 6.5. Some of the reports take 15 minutes to run and others take 1 second. Does anyone have any ideas on how to improve the speed of the reports through Business Objects? Through MS SQL Server? It seems like some calculations on the BO reports take forever. Any information, tips or hints would be a great help.

Rob Daddona - MedSpan Inc.


Listserv Archives (BOB member since 2002-06-25)

Rob,

Try timing the SQL of the report. In other words, get your DBA to run the SQL against the database and see how long it takes to return the data. This way you can isolate where the performance problem is: in the database/SQL or in the calculations of Business Objects.
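Jodie's isolation test can be sketched in code. SQL Server 6.5 isn't available here, so this hypothetical example uses Python's sqlite3 as a stand-in database, with an invented table and query; the point is simply to time the SQL on its own, with no Business Objects processing involved.

```python
import sqlite3
import time

# Hypothetical stand-in for "run the report's SQL by itself":
# SQLite plays the role of SQL Server, and the table/query are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(100_000)])

# Time only the database round trip, nothing else.
start = time.perf_counter()
rows = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchall()
elapsed = time.perf_counter() - start
print(f"query returned {len(rows)} row(s) in {elapsed:.3f}s")
```

If this timing is close to the full report runtime, the database/SQL is the bottleneck; if the SQL comes back quickly, the time is going into Business Objects' own calculations.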

Good Luck!
Jodie


Listserv Archives (BOB member since 2002-06-25)

What are the differences between what the reports are retrieving? You may find that the ones that take 15+ minutes have more constraints than the 1-second reports. Other things to check:
1. Rows being accessed per report.
2. Indexes: make sure indexes exist where needed on the tables, and check them against your WHERE clause(s).
3. Time of day the reports are running. There may be busier times during the day when everyone is trying to access similar data.
4. Type of report: the 15+ minute report may display more information, such as drilldowns. The more information you want to display, the more time it will take.
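Point 2 above (indexes matched to the WHERE clause) can be sketched as follows. This is a hypothetical illustration using Python's sqlite3 rather than SQL Server 6.5, with an invented `claims` table: the same query goes from a full table scan to an index search once an index matching the WHERE clause exists.

```python
import sqlite3

# Invented table as a stand-in for a real reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id INTEGER, claim_date TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [(i % 1000, f"2000-01-{i % 28 + 1:02d}", 10.0)
                  for i in range(50_000)])

query = "SELECT SUM(amount) FROM claims WHERE member_id = ?"

# Without an index, the plan is a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Add an index that matches the WHERE clause, then re-check the plan.
conn.execute("CREATE INDEX idx_claims_member ON claims (member_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[0][3])  # a SCAN of the table
print(plan_after[0][3])   # a SEARCH using the new index
```

The exact plan wording varies by database and version, but the scan-versus-index-search distinction is the thing to look for in any query plan.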

hope this helps,
Joe

“Daddona, Robert” 01/18/00 10:34AM >>>


Listserv Archives (BOB member since 2002-06-25)

Hi Robert,

There are a lot of factors that slow down query execution; here are some of them:

  • Index usage: missing indexes, or queries that make poor use of the existing indexes, usually hurt performance.

  • Join order: when you build your universe, make sure your joins are correct; a DBA can help you join your tables.

  • String concatenation: try to avoid concatenation in your query, and do it at the Business Objects report level instead.

–> There are tools (like Golden) that can display the QUERY PLAN for you; it will tell you the cost, the order in which the tables are accessed, and the indexes used for that query.
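Hamid's concatenation point can be sketched as follows. This is a hypothetical example in Python with sqlite3: instead of asking the database to build `first_name || ' ' || last_name`, pull the raw columns and combine them in the client layer (here Python stands in for the Business Objects report engine; the table and names are invented).

```python
import sqlite3

# Invented lookup table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (first_name TEXT, last_name TEXT)")
conn.executemany("INSERT INTO members VALUES (?, ?)",
                 [("Ann", "Lee"), ("Bob", "Ray")])

# Instead of SELECT first_name || ' ' || last_name FROM members,
# fetch the raw columns and concatenate at the report/client level.
rows = conn.execute("SELECT first_name, last_name FROM members").fetchall()
full_names = [f"{first} {last}" for first, last in rows]
print(full_names)
```

The database then ships plain column values, and the formatting work happens where it is cheap, in the report layer.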

Hope this helps
Hamid


Listserv Archives (BOB member since 2002-06-25)

Hi Robert,

And of course, the most important factor in DW applications: summary and join aggregates. Thankfully, as previously pointed out, the aggregate aware function can be used to let BO navigate the various levels of aggregation. Aggregation is beneficial in that many of the summarising functions and complex joins can be performed at load time, say on a daily basis, so these operations are not required at execution time. For example, take a 2 million row fact table with the grain at the order line item level. If you aggregate this to, say, the day level, i.e. remove order number and order line item number (and of course summarise the measures), this will probably reduce to around the 300,000 to 400,000 mark. Approx 80% reduction in the number of rows.

If you aggregate further, to say the weekly level, this can be reduced to fewer than 100,000 rows. Further aggregations produce diminishing returns; however, it all depends upon the specific requirements in terms of execution time, reporting frequency, storage restrictions, etc.

The overhead in producing these aggregates is not prohibitive. You could probably aggregate a 2 million row fact table to daily level in Oracle (using Parallel CTAS) in about 2-5 minutes, depending upon server configuration.
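The CTAS-style daily aggregate described above can be sketched as follows. This hypothetical example uses Python's sqlite3 and an invented toy fact table (Oracle's Parallel CTAS does the same thing at scale): drop the order number and line item number, summarise the measures with GROUP BY, and the aggregate table collapses to one row per day.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE fact_order_lines (
    order_date TEXT, order_no INTEGER, line_no INTEGER,
    qty INTEGER, revenue REAL)""")

# Toy fact table: 10 days x 500 orders x 2 line items = 10,000 rows.
conn.executemany(
    "INSERT INTO fact_order_lines VALUES (?, ?, ?, ?, ?)",
    [(f"2000-01-{d:02d}", o, l, 1, 9.99)
     for d in range(1, 11) for o in range(500) for l in range(2)])

# CTAS-style daily aggregate: remove order_no/line_no, summarise measures.
conn.execute("""CREATE TABLE agg_daily AS
    SELECT order_date, SUM(qty) AS qty, SUM(revenue) AS revenue
    FROM fact_order_lines
    GROUP BY order_date""")

detail = conn.execute("SELECT COUNT(*) FROM fact_order_lines").fetchone()[0]
daily = conn.execute("SELECT COUNT(*) FROM agg_daily").fetchone()[0]
print(detail, daily)  # the aggregate is a small fraction of the detail rows
```

A report that only needs daily totals can then read the aggregate table, which is what BO's aggregate awareness exploits when navigating the levels of aggregation.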

Of course, the first place to start is to make sure that the database is tuned optimally: i.e. sort area size, star transformation, db block size, multi-block read count, hash area size, indexing, physical file layout, disk configuration (i.e. striping, mirroring, cluster size, etc.), network bandwidth, etc.

A book in itself - in fact, it may be worth investing in a db tuning manual, particularly one focused upon DW applications.

Regards

Brian Patterson


Listserv Archives (BOB member since 2002-06-25)