I have been allocated 5 sessions per user for an account in an Oracle instance.
When I try to import a table into BODS, I get the error below:
Error: Cannot import metadata table. Simultaneous session limit exceeded (BODI-1112339 and ORA-02391).
I tried in SQL*Plus and was able to connect 5 sessions. The 6th session gave me the error, which confirms that the 5-simultaneous-sessions profile is applied to the account. But does BODS require more than 5 sessions? Please advise.
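If you want to confirm the limit on the database side, a query like the following should show it (assuming you have access to DBA_USERS and DBA_PROFILES; the account name is a placeholder):

```sql
-- Find which profile the account uses and its SESSIONS_PER_USER limit
-- (replace BODS_USER with the actual account name)
SELECT u.username, u.profile, p.limit
FROM   dba_users u
JOIN   dba_profiles p ON p.profile = u.profile
WHERE  p.resource_name = 'SESSIONS_PER_USER'
AND    u.username = 'BODS_USER';
```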
Can anyone tell me exactly how many connections the BODI engine and the Designer client actually use? The DBAs are reluctant to set this limit to an arbitrary figure like 99.
There is no simple answer to this. A job execution will use as many connections as you have objects (Dataflows) running in parallel. In addition, lookups that cache data will each establish a connection to the database. (The Dataflow should release the connection after a lookup has cached its result set, but you never know.) So say you have a Dataflow with two lookups: you could have three active connections. And what if another job is running that connects to the same database? Your session limit is global, so if two jobs run in parallel and each needs 3 connections/sessions, you exceed the 5-session limit by 1.
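If you want to see how many sessions the DS login actually holds while a job runs, a quick check on the Oracle side (assuming you can read V$SESSION) is:

```sql
-- Count active sessions per database user; watch this while a job runs
-- to see how many connections DS actually opens
SELECT username, COUNT(*) AS session_count
FROM   v$session
WHERE  username IS NOT NULL
GROUP  BY username
ORDER  BY session_count DESC;
```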
Things get even more complicated if you have enabled parallel processing on your tables.
Thanks to Jim & Werner. Does DS use Oracle's OCI drivers then? As I understand it, that is a prerequisite for using Oracle's connection pooling (the server-side pool called Database Resident Connection Pool, or DRCP).
Yes, DS uses the OCI driver. I was not aware that there was server-side connection pooling for Oracle; most of my clients use Oracle and this is the first I've heard of it. I'm not convinced that connection pooling is going to help you, but I look forward to being proven wrong.
It would be nice if DS supported connection pooling on the job server.
DS does very poorly when it comes to connections. We push two SQLs down to the source? 2 sessions. The query has 5 lookups? 5 more sessions. You load one target with number_of_loaders=20? 20 more sessions. That single Dataflow alone is holding 27 sessions.
We do not reuse connections within one dataflow, nor across multiple dataflows, and we potentially keep them open longer than required. A lookup in cache mode, for example, only needs its connection while the cache is being read; we could open it then and close it once the cache is loaded entirely.
By enabling connection pooling in the database, an idle connection no longer costs you anything. With Oracle in particular, a dedicated connection is an expensive thing, as it is backed by an Oracle shadow process, the process that does the database file reading for your session. Each session therefore carries a good percentage of the database program code, and that is huge.
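For reference, here is a rough sketch of what enabling DRCP looks like on the Oracle side (11g onwards); the sizing values are purely illustrative, and clients then request a pooled server via (SERVER=POOLED) in the TNS entry or a :POOLED suffix in EZConnect:

```sql
-- Start and size the default Database Resident Connection Pool
-- (run as SYSDBA; values below are illustrative, tune to your workload)
BEGIN
  DBMS_CONNECTION_POOL.START_POOL();
  DBMS_CONNECTION_POOL.CONFIGURE_POOL(
      pool_name          => 'SYS_DEFAULT_CONNECTION_POOL',
      minsize            => 5,
      maxsize            => 40,
      inactivity_timeout => 300);
END;
/
```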
If you are already exceeding the number of sessions, then having a pool of connections means that either your ETL will be denied a connection from the pool, or, if the pool is large enough (which wouldn't make sense if your repressive DBA created it for you), you would simply exceed the sessions that way.
With some connection pooling implementations I've seen, if an application requests a connection and none is available in the pool, the pool simply makes you wait (up to some time limit) until one becomes available. That would help in your situation: your job would pause instead of failing due to the inability to connect to the database.
Try baking some cookies for your DBA. Maybe sweet talking the DBA into allowing more sessions will be the easiest approach. If that doesn’t work then a frank discussion with management is in order. The amount of time and effort you are going to have to put into working around the session limit will cost the company money. As a consultant I often have to be the bearer of this type of news to management. It’s not a lot of fun but it’s the right thing to do.
For what it’s worth, I really don’t have a problem with the approach that DS is using. I would rather use a large number of connections and have my ETL run fast than use a small number of connections and have my ETL run slow.
Related to this, I'm seeing sessions left open on a Teradata box even after the job has successfully completed. When we restart the job, SDS creates a whole lot of new sessions and eventually we max out the box.
Is there any session-control functionality in SDS with which we can persuade it to close sessions it no longer needs?
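For diagnosis, the leftover sessions should be visible on the Teradata side; a sketch assuming you can query the DBC views (the login name is a placeholder):

```sql
-- List current Teradata sessions for the DS login to spot leftovers
-- (replace DS_USER with your actual Data Services login)
SELECT SessionNo, UserName, LogonDate, LogonTime
FROM   DBC.SessionInfoV
WHERE  UserName = 'DS_USER';
```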
Which version of DS are you using? I remember seeing something similar to the situation you are describing in an early version of DS3.2. Luckily, I was working with some SAP consultants and they got an emergency patch very quickly.
Sorry for reviving an almost-dead thread, but my issue matches this one perfectly.
For my project I've created 5 real-time jobs.
When these 5 jobs are published and started, they consume around 520 Oracle processes (really processes, not sessions!).
Our Oracle DB is using dedicated connections.
Why so many connections?
Well, because of the highly complex requirements, I need a lot of Dataflows (as you can see in the attached “JobDesign.png”) to build a kind of “framework”.
To provide data from one Dataflow to another, I use database tables (see attached “Merge.png”).
I have already tried to reduce the number of tables used by merging data before inserting it.
But the benefit was only 80 processes (better than nothing, but I had hoped for around 200…).
Does anybody have similar problems?
My colleagues and I are not sure whether switching from dedicated to shared connections on the DB side would help, since these are real-time jobs and BO creates the connections regardless of how they are handled on the DB side.
Please correct me if I’m wrong.
At the moment this is the first project using the framework; around 10 more will come.
And I'm not sure whether an Oracle DB can handle around 10,000 processes (for me and all the jobs designed by my colleagues).
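To see how close you already are to the instance ceiling, you could compare the current process count against the PROCESSES parameter (assuming access to the V$ views):

```sql
-- Current number of Oracle server processes vs. the configured maximum
SELECT (SELECT COUNT(*) FROM v$process)                           AS current_processes,
       (SELECT value FROM v$parameter WHERE name = 'processes')   AS max_processes
FROM   dual;
```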
On a more serious side…
Processes vs. sessions: if you have parallelism turned on for your tables, then a single Dataflow connection (session) could very well spawn additional processes. Consider turning OFF parallelism to reduce the impact.
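To verify what is actually going on, you can map each session to its backing server processes; a sketch assuming access to V$SESSION and V$PROCESS (the account name is a placeholder):

```sql
-- One row per (session, server process) pair for a given user;
-- parallel slaves show up as extra processes behind the same login
SELECT s.sid, s.username, s.program, p.spid AS os_process_id
FROM   v$session s
JOIN   v$process p ON p.addr = s.paddr
WHERE  s.username = 'DS_USER';  -- illustrative account name
```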
Using database tables in your “framework”: I have started using files for intermediate storage instead of database tables. This has improved the execution speed of my jobs (batch, not real-time) by reducing network traffic as well as database overhead. Plus, the DBA is quite happy that the redo/undo logs are no longer getting bloated.