I am working on a data flow that I inherited which was created in SAP BODS v 12.2.2.3. Our dev box has 16GB of RAM, and the DSConfig pageable cache buffer size is set to 0.
The data flow works on production, but exactly the same code is throwing an error on dev:
Initializing environment failed for pageable cache with error <-30975> and error message <DbEnv::open: DB_RUNRECOVERY: Fatal error, run database recovery>.
I have a view as a source with cache = yes and array fetch size = 100 (the default was 1000; I lowered it to see what it does), and two targets (one update, one insert), neither with bulk loading selected.
I am running the job with none of the statistics options selected.
Try running the job so that it gathers statistics. However, this part of the error “<DbEnv::open: DB_RUNRECOVERY: Fatal error, run database recovery>” may indicate a database issue, not a job server issue.
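For what it's worth, DbEnv::open is the Berkeley DB API, and the pageable cache is backed by a Berkeley DB environment on the job server, so DB_RUNRECOVERY usually means the cache directory still holds environment files left over from a crashed run. A rough sketch of the cleanup, with the directory path as an assumption (look up the real pageable cache directory configured for your job server, e.g. in DSConfig.txt):

```shell
# Hedged sketch: clear stale Berkeley DB environment files from the
# pageable cache directory. PCACHE_DIR is an assumed path -- replace
# it with the pageable cache directory your job server actually uses.
PCACHE_DIR="${PCACHE_DIR:-/tmp/pcache_demo}"
mkdir -p "$PCACHE_DIR"

# Stop the job server first on a real system, then remove the
# Berkeley DB environment/region and transaction log files that
# trigger DB_RUNRECOVERY on the next open:
rm -f "$PCACHE_DIR"/__db.* "$PCACHE_DIR"/log.*
```

If the error goes away after that, it was a corrupt cache environment rather than anything in the dataflow itself.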
I reran it with statistics for optimisation, and it still gives the same error. From what I have read online, collecting statistics on a job can sometimes force it to run in memory, even when it has been set to use pageable cache.
We have restored the latest db from production back to dev, and I have re-imported both the target and source tables in the respective datastores.
No, it’s the opposite way. When you run the job with “Gather statistics for optimization” turned ON, the job will run everything using pageable cache.
When you turn ON “Use collected statistics”, the dataflow may use in-memory cache depending on what the statistics came up with.
If you set the Cache type property of the dataflow to In-memory, then it will always use in-memory cache without regard to the statistics (though it still gathers statistics, which is kind of dumb).
So, did the db changes you made make any difference?