I understand that BO holds a unique ID for each object in a universe.
Does anybody know how to generate the following list for a specific universe
Class.object “Unique BO Object_id”
Listserv Archives (BOB member since 2002-06-25)
Does anybody know how to generate the following list for a specific universe
Class.object “Unique BO Object_id”
xxxxx.xxxxxxx blah blah blah
The UNV_OBJECT table in the BO Repository contains all the information necessary to produce the results you desire. The following SQL statement should be a good start:
SELECT Universe_ID, Class_ID, Object_ID, Obj_Name FROM UNV_OBJECT WHERE Universe_ID = 20
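As a sanity check, here is a small sketch that runs the same query against a throwaway SQLite copy of UNV_OBJECT. The table and column names come from the post above; the sample rows and the choice of Universe_ID 20 are invented for illustration only.

```python
import sqlite3

# Throwaway in-memory stand-in for the repository's UNV_OBJECT table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE UNV_OBJECT (
    Universe_ID INTEGER, Class_ID INTEGER,
    Object_ID INTEGER, Obj_Name TEXT)""")
conn.executemany(
    "INSERT INTO UNV_OBJECT VALUES (?, ?, ?, ?)",
    [(20, 1, 101, "Customer Name"),   # sample rows, not real repository data
     (20, 1, 102, "Customer City"),
     (20, 2, 201, "Revenue"),
     (30, 1, 301, "Product Name")])   # belongs to a different universe

# The query from the post, parameterized on Universe_ID.
rows = conn.execute(
    "SELECT Universe_ID, Class_ID, Object_ID, Obj_Name "
    "FROM UNV_OBJECT WHERE Universe_ID = ?", (20,)).fetchall()

# Print in roughly the "Class.object  Object_id" shape asked for above.
for universe_id, class_id, object_id, name in rows:
    print(f"{class_id}.{name}\t{object_id}")
```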
Adios,
January
Listserv Archives (BOB member since 2002-06-25)
O.K., you got me thinking. There is only one image/blob field in the repository database (X_DOC_B_CONTENT in OBJ_X_DOCUMENTS), where I have always assumed that all documents, including the Universes, were stored. But since just about all of the Universe info is in the other tables, I have to wonder what the connection is between the binary Universe file and the data in the tables. Does the Universe get stored in the image field, in the non-image tables, or in both?
If it is both, then what happens when you change something in the other tables? How does the local Universe file know that the database has been changed? Does it try to import the Universe to the local client again? If the Universe is not stored as an image, does this mean that when you import the Universe it creates a binary file on the fly?
When we import a Universe over a dial-up connection, a full Meg of data gets imported to create a 70 KB file. Any ideas on this?
Hope someone out there has some ideas or answers. Also, does anyone have a data model of the repository? I have started to reverse-engineer one, but it is not going to be easy and will take a while.
Thank you,
Simon
Listserv Archives (BOB member since 2002-06-25)
In a message dated 00-01-19 09:57:31 EST, you write:
O.K. you got me thinking. There is only one image/blob field in the repository database (X_DOC_B_CONTENT in OBJ_X_DOCUMENTS) where I have always assumed that all documents including the Universes were stored there. But since just about all of the Universe info is in the other tables then I have to wonder what the connection is between the binary Universe file and the data in the tables. Does the Universe get stored in all of the non-image tables or both?
The .UNV file gets broken down into components. Watch carefully on the bottom of the Designer screen when you export a universe and you will see the Classes, Objects, Joins, etc. being exported.
On import, the reverse happens: the components are downloaded and combined back into the single .UNV file.
If it is both then what happens when you change something in the other tables?
How does the local Universe file know that the database has been changed?
There is a flag in the repository that stores the last time the universe was changed. That flag is also stored on the local client copy. When a user logs in, the two flags are compared. If a change has been made then the universe is imported.
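That comparison can be sketched roughly like this. The function and argument names are hypothetical; the thread does not name the actual repository flag, only that a last-changed timestamp is stored on both sides and compared at login.

```python
from datetime import datetime

def needs_import(repo_changed, local_changed):
    """Decide whether the client should re-import the universe.

    repo_changed: last-change timestamp stored in the repository.
    local_changed: timestamp on the local client copy, or None if
    the universe has never been imported on this machine.
    """
    if local_changed is None:
        return True                      # no local copy yet
    return repo_changed > local_changed  # repository copy is newer

# The repository copy was exported after the local copy was made,
# so the client would trigger an import at the next login.
print(needs_import(datetime(2000, 1, 19), datetime(2000, 1, 15)))
```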
Does it try to import the Universe to the local client again?
Yes, if appropriate.
If the Universe is not stored as an image, does this mean that when you import the Universe it creates a binary file on the fly?
Yes.
When we import a Universe over a dial-up connection a full Meg of data gets imported to create a 70 KB file. Any ideas on this?
I have seen lots of volume flying back and forth over dial-up connections. Frankly, I suspect that the same volume of data moves over a normal connection; it is just not as apparent because of the speed of the connection.
There is a lot going on during the login process that has nothing to do with the universe. The user record is queried from the repository. This includes checking for time stamps and universe overrides, two features that can be turned off. If you have a large number of users in your repository, this process can take a while. There are other factors as well. I suspect that you are dealing with more than just a universe.
I would suggest doing the same test when a universe is NOT being imported. In other words, sign on via a dialup connection and measure the data flow during the log in process. I suspect you will be surprised at the volume of data that moves just to get logged in!
As another test, try using “Offline” mode over a dial-up connection. This will bypass the repository altogether. Your login process should complete very quickly since none of the overhead is involved. However, you will not get any new universe updates.
Hope someone out there has some ideas or answers. Also does anyone have a data model of the repository? I have started to reverse engineer one but it is not going to be easy and will take awhile.
Sure… you do! Look on the BusObj CD-ROM. There should be a series of web pages that detail the tables of the repository and the joins. Be advised that some of the information is dated or appears to be incorrect based on what I have actually found in the repository tables, but the majority of the info seems to be correct.
Regards,
Dave Rathbun
Integra Solutions
www.islink.com
Listserv Archives (BOB member since 2002-06-25)
Hi Dave,
Thank you for your response. It is very valuable and has helped move me forward on my “big” problem. I will look for the joins info in the repository documentation as this will make the data model much easier to build! I have done testing with the logging in both with online and with offline mode.
I have told the users to use offline mode when they dial in. The problem is not with logging in, but when they go to create a report for the very first time. A full Meg of data gets downloaded and takes at least 20 minutes to create the 70 KB Universe file over a 56K modem. I can download the Universe over the network in 11 seconds, but then I have a 100 Mbit connection.
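A back-of-the-envelope check of those numbers, assuming the full theoretical 56 kbit/s (which a real modem never reaches):

```python
payload_bits = 1 * 1024 * 1024 * 8   # "a full Meg" of import traffic
modem_bps = 56_000                   # theoretical 56K modem ceiling

ideal_minutes = payload_bits / modem_bps / 60
print(f"ideal transfer time: {ideal_minutes:.1f} min")  # about 2.5 min

# The observed 20 minutes is roughly 8x the raw transfer time, which
# suggests per-component round trips rather than payload size alone.
overhead = (20 * 60) / (payload_bits / modem_bps)
print(f"overhead factor: {overhead:.1f}x")
```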
Question: if I clean up and delete a number of obsolete things in the database (which is why I need the data model, for referential integrity), would this potentially create a smaller Universe file and therefore a faster import? Alternatively, if I were to clean up the Universe, deleting a lot of hidden “junk”, and import it, only to change the exported flag back to the original date, would this help the new users? The “big” problem I have is that I cannot release new versions of the Universe through the repository to remote users, because they will not be impressed with the instructions “dial in, log in, click on ‘tools – universes – import’, go for an extended lunch”.
Finally am I the only one in the world who is having a hard time getting Universes imported on machines that are dialed-in remotely?
Thanks again,
Simon
Listserv Archives (BOB member since 2002-06-25)
Finally am I the only one in the world who is having a hard time getting Universes imported on machines that are dialed-in remotely?
No you are not the only one. I’m reading this thread intensely!
Listserv Archives (BOB member since 2002-06-25)
In a message dated 00-01-19 12:44:28 EST, you write:
Question: if I clean and delete a number of obsolete things in the database (which is why I need the data model for referential integrity), would this potentially create a smaller Universe file and therefore a faster import?
I would not suggest deleting anything from the repository directly. If you can, use the Supervisor program to review / delete any obsolete users, documents, universes, etc.
Directly manipulating the repository is not really a good idea, nor is it supported by BusObj. (I have to say that… it’s in my contract.)
Alternatively if I were to clean up the Universe deleting a lot of hidden “junk” and imported it only to change the exported flag back to the original date, would this help the new users? The “big” problem that I have is that I can not release new versions of the Universe through the repository to remote users because they will not be impressed with the instructions “dial in, log in, click on ‘tools – universes – import’, go for an extended lunch”.
Here’s a suggestion: why not email the .UNV file to your remote users? Once you have completed your modifications (cleaning, etc.), go ahead and export the universe. Then email the .unv file to the users with instructions on where to place the file.
You could of course ZIP the file to make the email process even faster, if your users understand / are capable of unzipping properly. The shareware program WinZip (www.winzip.com) will even let you create an executable file that will unzip to the proper directory, no user input required.
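As a sketch of that packaging step in Python: the file name and directory are placeholders, and the 70 KB stand-in file compresses far better than a real .UNV would.

```python
import tempfile
import zipfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
unv_path = workdir / "Sales.unv"             # hypothetical universe file
unv_path.write_bytes(b"\x00" * 70 * 1024)    # stand-in for 70 KB of content

# Compress the .UNV so the email attachment is smaller.
zip_path = workdir / "Sales_unv.zip"
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write(unv_path, arcname=unv_path.name)

print(f"{unv_path.stat().st_size} -> {zip_path.stat().st_size} bytes")
```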
I would test this on another workstation first, meaning make sure that receiving the updated UNV file via email is enough to circumvent the import process.
On the other hand, if your users are logging in using “offline” mode, then no import would happen anyway.
Regards,
Dave Rathbun
Integra Solutions
www.islink.com
Listserv Archives (BOB member since 2002-06-25)
Hi Dave,
The only issue with sending out the zipped file is that I have seen other threads where users could not create their own user-defined objects when they manually copied the Universe to their hard drive. I was thinking that this was our only solution, and to deal with the complications as they arise. Thank you for the advice on not touching the repository!
Simon
Listserv Archives (BOB member since 2002-06-25)
what I have actually found in the repository tables, but the majority of the info seems to be correct.
Also on the CD-ROM is a universe file called managero.unv – this gives basic reporting requirements from the repository. Although it isn’t the best universe I’ve seen, it’s a place to start. If you want help on decrypting the various flag fields and formatting dates correctly, this info is available (although to be honest it’s a little obscure) on the CD-ROM. I’m sure someone here can provide the info you require.
Regards
Brian Patterson
Listserv Archives (BOB member since 2002-06-25)
Question: if I clean and delete a number of obsolete things in the database (which is why I need the data model for referential integrity), would this potentially create a smaller Universe file and therefore a faster import? Alternatively if I were to clean up the Universe deleting a lot of hidden “junk” and imported it only to change the exported flag back to the original date, would this help the new users? The “big” problem that I have is that I can not release new versions of the Universe through the repository to remote users because they will not be impressed with the instructions “dial in, log in, click on ‘tools – universes – import’, go for an extended lunch”.
Simon,
Using offline mode is fine, except that then they do not ‘check in’ to the repository, and if their privileges change they do not get these changes and can still ‘see’ all of the resources that they could the last time they came in online.
As to the universe, I would be sure that there are no obsolete objects for the remote users. It may be worth evaluating their reporting requirements and changing the universe to the minimum objects, joins etc. necessary for their information (if it is any different).
Listserv Archives (BOB member since 2002-06-25)
Some of my users dial in remotely also and have had problems with connections being dropped. My connections are defined to stay active for one minute, then drop. I prefer them this way to keep the total number of logged-in users to a minimum. We have a main Oracle application which uses the same databases that Business Objects does. What I have these users do is bring up the main application, log in, then minimize it. This keeps the connection open, so Business Objects won’t take it down if it hasn’t gotten a response for a minute.
Works for us,
Roger Poole
“Clayton, Cindy, HRGBM” cindyclayton@ATT.COM on 01/19/2000 12:49:38 PM
Listserv Archives (BOB member since 2002-06-25)
How long does it take them to import a Universe?
Simon
Listserv Archives (BOB member since 2002-06-25)