I have a report with a lot of data (> 4m records) being returned by different data providers and then displayed as HTML. Calculations are run against the data in Business Objects before it is finally displayed.
My problem seems to be that BO is chewing up all the memory on the BCA server and then crashing the server. The memory doesn't seem to be freed until the BO document is closed.
Each data provider is on a separate sheet in the rep file. I have tried sending each sheet in the rep file to BCA as a separate job.
My question is: does the document close after each sheet is refreshed? The last time we ran three separate sheets through BCA it managed to eat 3GB of memory.
Any help/advice would be greatly appreciated as the problem is going to get worse and we are still in the process of justifying moving to Business Objects as a tool.
My personal experience is that BusObj will have problems when approaching one million rows in the microcube.
My advice would be to revisit why you need more than 4 million rows brought back in a BusObjects document. Who would look at 4+ million rows anyway?
Is there anything you can do to process and aggregate/project those 4+ million rows at the database and only return a fraction of them to BO?
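To make the suggestion concrete, here is a minimal sketch of what "aggregating at the DB" means, using SQLite in Python purely for illustration (the table and column names are hypothetical, not from the original poster's system). The idea is to let the database do the `GROUP BY` so BO only receives the summary rows, not the millions of detail rows.

```python
import sqlite3

# Hypothetical schema: a detail table of monthly sales records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, region TEXT, amount REAL)")
rows = [("2003-01", "North", 10.0), ("2003-01", "North", 5.0),
        ("2003-02", "South", 7.5), ("2003-02", "South", 2.5)]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Instead of pulling every detail row into the microcube, project the
# data down to one row per month/region before it leaves the database.
summary = conn.execute(
    "SELECT month, region, SUM(amount), COUNT(*) "
    "FROM sales GROUP BY month, region ORDER BY month, region"
).fetchall()

print(summary)
```

The same effect can often be had in BO itself by editing the data provider's SQL (or building the aggregation into a database view), so the microcube holds thousands of rows instead of millions.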
One of the greatest challenges with displaying large reports as HTML is that the HTML overhead can be much larger than the data you are trying to display. If possible, try to deliver these large reports via Acrobat. You will run into a size ceiling with that too, but it is much higher than with HTML.
I will stay off the soapbox about how much data you are retrieving.
We are looking at ways to shift the load to a different server through data warehousing and by using database views, procedures, etc., but this is going to take time and money. Using the actual database to do the calculation is a no-go.
The 4 million rows are retrieved to give the user the ability to look back through previous months' data and compare/contrast/check existing data. There is no way the amount of data being retrieved can be reduced any further.
The report is created in HTML as it is published to the intranet site (we do not want to give licences to all users) and has a drill-down facility to let users focus on particular areas of interest.
The problem only exists when the .rep file is not closed by BCA after the refresh of each page.
We tested this out on the PCs to start with and watched the memory usage. Each time a query was refreshed the memory usage rose and wouldn’t come back down again until the document was closed. It uses all the memory and then gives an unhandled exception error and bombs out of BO.
On the BCA we get all sorts of weird error messages before the BCA crashes.
I like weird messages - wanna share? Seriously, it might be worth having a look at this thread for UAEs, although I don't think it's going to get rid of them permanently for you. What version are you using? I wonder if an upgrade might help you out.
You said you experimented with sending each sheet as a separate report to BCA - did each report you sent have only a single sheet in it?
We are using 5.1.7 and looking to upgrade to version 6 in the near future.
I only have one report, but there are multiple sheets within that report. We send each sheet to the agent to be refreshed individually, hoping that the file would close between sheets and free the memory.
I can't actually tell you the errors as there were loads appearing on the BCA. On the PC, however, the error was due to not having enough memory. We have now increased the memory to 700MB or thereabouts.
Maybe I'm being a bit slow here. If you have a report with four sheets, change it to four reports, each with a single sheet. Purge the data providers and schedule all four reports. To start off with, get them running one at a time: make sure only one job runs at once. How much memory does the BCA server have? Make sure your page file is set to twice this amount and make sure it is static (i.e. minimum and maximum set to the same size).