I have had reports of all sizes, up to 30MB. I've noticed that the amount of data initially displayed plays a big part in the performance of a report. For example, I have one report (29MB) with a hierarchy of Year -> Quarter -> Month -> Specialty -> Provider. Even though this report is huge, the data initially displayed is less than one page, and performance is not a problem. I have other reports that may be only around 5MB but have just a two-level hierarchy, so a lot of data is displayed up front; those reports are very slow.
I've come to realize that I need to keep the number of initial pages of a report down to aid performance. Therefore, on some of the reports that have been poor performers, I have added thresholds to the SQL to limit the data returned. For example, one report looks at medical procedure codes and drills down to the providers (physicians) who bill them. There are close to 15,000 procedure codes, so that would be a huge report. My users are not really interested in codes that are billed infrequently, so the thresholds limit the query to the frequently used codes. Obviously, this works for our users, but it might not for yours.
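The thresholding idea above amounts to a HAVING clause on the report's SQL. A minimal sketch, assuming a hypothetical billing table; the table, column names, procedure codes, and sqlite3 (standing in for the real warehouse DBMS) are all illustrative:

```python
# Sketch: keep only procedure codes billed more than a threshold number
# of times, so rarely used codes never reach the report.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE billing (proc_code TEXT, provider_id TEXT);
    INSERT INTO billing VALUES
        ('99213','P1'),('99213','P2'),('99213','P3'),  -- frequent code
        ('99214','P1'),('99214','P2'),('99214','P4'),  -- frequent code
        ('0042T','P5');                                -- rarely billed code
""")

THRESHOLD = 2  # include only codes billed more than this many times
rows = conn.execute("""
    SELECT proc_code, COUNT(*) AS uses
    FROM billing
    GROUP BY proc_code
    HAVING COUNT(*) > ?
    ORDER BY proc_code
""", (THRESHOLD,)).fetchall()
print(rows)  # the rare code 0042T is filtered out before the report sees it
```

In a real universe, the same condition could live in a predefined condition object so report writers can apply it without hand-editing SQL.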
What is considered too big a report? 6MB and up?
Mostly we had regular ad hoc reports of at most 5MB, but now we are converting all corporate reports from other reporting tools (QMF, EasyTrieve) to BO and are starting to run into problems. Most of these reports retrieve a lot of data, more than 300,000 rows. The problem is not so much running them (we can set them up with DAS or BCA) as maintaining and manipulating them: every button we click, from editing a variable to opening the Slice and Dice panel, takes 3-5 minutes.
Anyone out there with the same experience?
Thanks in advance, and thank you all for all the help you have given with my previous questions.
This list has been a life saver for us.
Edna
You wrote: < What is considered too big a report? 6MB and up?
Mostly we had regular ad hoc reports of at most 5MB, but now we are converting all corporate reports from other reporting tools (QMF, EasyTrieve) to BO and are starting to run into problems. Most of these reports retrieve a lot of data, more than 300,000 rows. The problem is not so much running them (we can set them up with DAS or BCA) as maintaining and manipulating them: every button we click, from editing a variable to opening the Slice and Dice panel, takes 3-5 minutes. >
If it is necessary to work with such a large document, the best approach is to make all variable and formatting changes in structure mode. The report is then not recalculated every time you make a change, and response times improve.
Kind Regards
Annaliza Janse van Rensburg
BO Product Manager
Usko Software
Hi Edna
Yes, when we edit big reports with thousands of records it does take time, because the columns in the block have to be recalculated after every change. Before you make any changes, you can do the following:
Click the Structure button on the toolbar, or choose View -> Structure.
This shows the report structure; make your changes there. When you have finished and want to see the data again, click the same button (or choose View -> Structure again).
Hope this information helps you.
Bye
Raheel
Another possibility (more for the long haul) would be to add aggregate tables (summary tables) to your database structure and apply the @AggregateAware functionality to some of your objects. This could drastically reduce the number of rows returned by some of your reports. One company that implemented summary tables and aggregate awareness reduced the average size of some of its larger reports from over 450,000 rows to about 120,000 rows, and cut run times from a couple of hours to under 45 minutes.
Of course, this involves creating and maintaining the summary tables, but as I said, it is more of a long-haul solution.
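A minimal sketch of what a summary table buys you, using sqlite3 as a stand-in DBMS; the claims tables and columns are hypothetical. In BO, the @AggregateAware function in the universe would direct queries to the small summary table whenever the requested objects allow it:

```python
# Sketch: pre-aggregate detail rows into a summary table so reports
# read a handful of rows instead of the full detail.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (yr INTEGER, specialty TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1999, "Cardiology", 100.0)] * 5000 + [(1999, "Oncology", 250.0)] * 5000,
)

# Built once, e.g. in a nightly batch load, not at report run time.
conn.execute("""
    CREATE TABLE claims_summary AS
    SELECT yr, specialty, SUM(amount) AS total_amount, COUNT(*) AS n_claims
    FROM claims
    GROUP BY yr, specialty
""")

detail_rows = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
summary_rows = conn.execute("SELECT COUNT(*) FROM claims_summary").fetchone()[0]
print(detail_rows, summary_rows)  # 10000 detail rows collapse to 2
```

The trade-off is the batch job and extra storage, which is why this is a longer-term fix rather than a quick one.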
Tony Burton
Of course, this could also be achieved by putting a Sum() around the measure object in Designer. The report size can be drastically reduced because the aggregation is moved to the DBMS; however, this might still take a lot of server time.
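The effect of wrapping the measure in Sum() can be sketched as follows, again with sqlite3 standing in for the DBMS and a hypothetical billing table; the point is that the GROUP BY happens on the server, so far fewer rows cross the network into the document:

```python
# Sketch: a plain-column measure fetches every detail row; a SUM()
# measure lets the server aggregate first and return one row per provider.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE billing (provider TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO billing VALUES (?, ?)",
    [("P%d" % (i % 10), 1.0) for i in range(100_000)],
)

# Measure defined as a plain column: every detail row comes back.
detail = conn.execute("SELECT provider, amount FROM billing").fetchall()

# Measure defined as SUM(amount): the DBMS aggregates before returning.
agg = conn.execute(
    "SELECT provider, SUM(amount) FROM billing GROUP BY provider"
).fetchall()
print(len(detail), len(agg))  # 100000 rows versus 10
```

The totals are identical either way; only where the work happens changes, which is why the server can still be busy even though the document shrinks.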