Document Tables

Hi,

This is probably for you Dave.

There are three tables: OBJ_M_DOCUMENTS, OBJ_X_DOCUMENTS, and DS_PENDING_JOB.

Can you tell me what unique information these three tables contain?

Let me make a note of my findings/assumptions…

OBJ_M_DOCUMENTS → Stores info about any document processed in the repository (i.e., either sent to DAS or sent to the repository to be shared by other users). Each time the document is processed, a new row is inserted into the table with a unique document ID.

OBJ_X_DOCUMENTS → Stores the graphic info of the document in blocks (i.e., multiple records for the same document), whether sent via DAS or stored in the repository. These records are deleted when the document is deleted either from DAS (View Log screen) or in Supervisor (delete documents). Note: deleting the document via Supervisor does not delete the info from DS_PENDING_JOB. (A quick query sketch follows these notes.)

DS_PENDING_JOB → Stores info about documents run via DAS. Rows can be deleted only in the DAS module and do not get deleted even if the user deletes the document from the Reporter module.
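If my note on OBJ_X_DOCUMENTS above is right, something like this should show several block rows per document (just a sketch, assuming X_DOC_N_ID is the document key on that table):

-- Count how many block rows each document occupies in OBJ_X_DOCUMENTS
select x_doc_n_id, count(*) as block_count
from obj_x_documents
group by x_doc_n_id
order by block_count desc;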

Please correct me if I am wrong and add to my findings…

The whole purpose of doing this is to purge records from the OBJ_X_DOCUMENTS table, whose size has grown to 1780 MB.

Thanks
Ashish


Listserv Archives (BOB member since 2002-06-25)

In a message dated 00-01-21 09:30:37 EST, you write:

OBJ_X_DOCUMENTS → Stores the graphic info of the document in blocks (i.e., multiple records for the same document), whether sent via DAS or stored in the repository. These records are deleted when the document is deleted either from DAS (View Log screen) or in Supervisor (delete documents).

This table also stores custom LOVs that have been exported by a universe designer. As with documents, the LOV is stored in chunks of binary data and reconstructed as a .LOV file when the universe is imported.

There is a link - I have not looked it up this morning - between the document table and the universe table that stores the relationship between the object and the custom LOV file that has been exported.

You could probably find these entries by looking for document IDs that do not exist in OBJ_M_DOCUMENTS. That's a guess; I have not validated that assumption.
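A sketch of that check, under the same guess (Oracle's MINUS returns values present only in the first query; the column names are the ones used elsewhere in this thread):

-- Block IDs in the document domain with no matching master record
select x_doc_n_id from obj_x_documents
MINUS
select m_doc_n_id from obj_m_documents;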

To delete these files, you would need to determine if the custom LOV is still in use, and if not, remove it from the universe. I would presume that would then delete the file from the repository table.

If you are concerned about space, you could retrieve all of the documents from the repository, purge them, and re-export them. That is, of course, assuming that you don’t need the data in the repository.

I believe that once you delete files you need to run a scan/repair/compact operation in Supervisor. From what I remember, documents (and other items) are not actually deleted from the repository until this is done.

A side note about Oracle: deleting records will not shrink the space taken by the data. Essentially, Oracle stores data in a parking lot. If the parking lot gets full of cars, then it rolls out more concrete and creates additional parking spaces. When the cars leave, the extra concrete is still there; you just have more spaces for potential cars. So, deleting data out of your tablespace is like having cars leave: you will not reclaim any extra room and be able to build something different unless the DBA does some more work.
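If you want to see this for yourself, watch the allocated space before and after a delete (a rough sketch; it assumes you can query DBA_SEGMENTS as a suitably privileged user):

-- Allocated (not used) space for the big table; this number will not
-- drop after a DELETE, per the parking-lot analogy above
select segment_name, bytes/1024/1024 as mb
from dba_segments
where segment_name = 'OBJ_X_DOCUMENTS';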

Regards,
Dave Rathbun
Integra Solutions
www.islink.com


Listserv Archives (BOB member since 2002-06-25)

Ashish,

One of the reasons for the huge space occupied by the OBJ_X_DOCUMENTS table may be that documents are sent to the repository with the data in them. It is always advisable to purge the data before sending a document to the repository, because users can refresh the report whenever they retrieve it.


Prasad Shenoy
CMOS Soft Inc., CA.



Listserv Archives (BOB member since 2002-06-25)

Prasad,

That's a good suggestion, but it only holds good for those documents that are stored in the repository to be shared by users. What about those that remain in OBJ_X_DOCUMENTS after they are sent via DAS?

Thanks
Ashish


Listserv Archives (BOB member since 2002-06-25)

Dave,

This again covers only those documents which are sent to the repository to be shared by other users. What about those which are stored via DAS?

Ashish


Listserv Archives (BOB member since 2002-06-25)


mirji@CRD.GE.COM (Mirji, Ashish (CRD)) wrote:

Dave,

This again covers only those documents which are sent to the repository to be shared by other users. What about those which are stored via DAS?

Ashish

Don't mess about - this is a joint bo/dba task. Export all reports from the repos to disk, then make a note of all the DAS/BCA job settings. Then, in the db, truncate the various tables, i.e. ds_pending_job and obj_x_documents. Then coalesce the tablespace. This should free up all that extra space in there - the data file size won't change, but there will be a lot less data for the db to trawl through. Once complete, send all reports back to the repos and reschedule via DAS/BCA.
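In Oracle terms that is roughly the following (a sketch only; REPO_DATA is a made-up tablespace name, so substitute your own, and only do this after everything has been exported to disk):

-- Sketch only: this empties the tables outright, so export first!
truncate table ds_pending_job;
truncate table obj_x_documents;
-- Merge adjacent free extents back together in the tablespace
alter tablespace REPO_DATA coalesce;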

As someone mentioned previously though, purge all data from the reports prior to sending them to DAS/BCA and the repos, and don't forget to repair and compact the repos after truncating the tables.

Brian Patterson


Listserv Archives (BOB member since 2002-06-25)

I see the dates in int10 format in the OBJ_X_DOCUMENTS table. I couldn't make sense of the date; e.g., 1/21/2000 was stored as '918385883'. Any idea how to get them back into the usual date format?

Thanks,

Ashwini


Listserv Archives (BOB member since 2002-06-25)

The date is stored as (correct me on this) the number of seconds since some date back in 1975… You have to calculate the date from this start date. If it helps, I THINK it's December 12th, 1975. If you have a date you know (like updating a report today), you should be able to calculate it from the _LAD field in the repository…

Good luck,
Brent



Listserv Archives (BOB member since 2002-06-25)

Here is a sample of the formula (substitute your repository date column for <date field>):

='12/15/1970' + ((<date field>/3600)/24)

brent



Listserv Archives (BOB member since 2002-06-25)


Here is the query that I use in a report to notify me via email of the status of reports that ran the previous night. (Say that five times fast.) This is for SQL Server 7, but you should be able to do the same thing for any database. The date comes in as UTC (Coordinated Universal Time) seconds: the value is the number of seconds since an epoch which, in my Eastern time zone, is 7 PM on December 14, 1970 (midnight on December 15, GMT). You need to convert the field to decimal, then add that number of seconds to the December 14 date/time (adjusting for your time zone) to calculate the proper date.

select
    right(businessobjects..obj_m_documents.m_doc_c_name, 45),
    DATEADD(second, convert(decimal, businessobjects..ds_pending_job.SUBMIT_DATETIME), '1970-12-14 19:00:00.000') 'Submit Date/Time',
    DATEADD(second, convert(decimal, businessobjects..ds_pending_job.START_DATETIME), '1970-12-14 19:00:00.000') 'Start Date/Time',
    DATEADD(second, convert(decimal, businessobjects..ds_pending_job.END_DATETIME), '1970-12-14 19:00:00.000') 'End Date/Time',
    businessobjects..ds_pending_job.JOB_STATUS,
    businessobjects..ds_pending_job.JOB_ERROR,
    businessobjects..ds_pending_job.ERROR_TEXT
from businessobjects..ds_pending_job, businessobjects..obj_m_documents
where businessobjects..ds_pending_job.document_id = businessobjects..obj_m_documents.m_doc_n_id

Good luck!
Simon


Listserv Archives (BOB member since 2002-06-25)

Don't create confusion… here is the simple format:

new_time(to_date('15-DEC-1970','DD-MON-YYYY') + (utc_date_field/86400), 'GMT', 'EST')
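As a quick sanity check against the example value from earlier in the thread (918385883/86400 is about 10629.5 days, which lands on 21-JAN-2000):

-- Should return 21-JAN-2000 (GMT-based, before the EST adjustment)
select to_date('15-DEC-1970','DD-MON-YYYY') + (918385883/86400) from dual;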

Ashish


Listserv Archives (BOB member since 2002-06-25)

Vinay,

OBJ_X_DOCUMENTS does not contain versions of the same document; you are mistaken if you have seen many records there for the same document. Those are actually block IDs containing the graphical info, the data, and the other features associated with the document, so all the block IDs collectively form one document. OBJ_M_DOCUMENTS is the table that gets a new record for the same document every time you run it.
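A quick way to see that behaviour (a sketch based on the column names used elsewhere in this thread):

-- Documents with more than one master record, one per processing run
select m_doc_c_name, count(*) as runs
from obj_m_documents
group by m_doc_c_name
having count(*) > 1;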

Can anyone tell me where the LOV definition is stored?

Thanks
Ashish


Listserv Archives (BOB member since 2002-06-25)


Thanks for all the responses regarding the calculation of the document date. However, I wonder why BusinessObjects made such a simple thing so complicated. This is very specific to a user installation, and one is usually just interested in the most recent date the document ran. Am I missing something?

Thanks,

Ashwini


Listserv Archives (BOB member since 2002-06-25)

Combining a lot of answers in one email…

In a message dated 00-01-21 14:14:08 EST, you write:

This again covers only those documents which are sent to the repository to be shared by other users. What about those which are stored via DAS?

Ashish

That's what I meant by "assuming you don't need the data in the repository". DAS is - of course - designed to run a document and distribute the data. For those documents the "purge" suggestion makes no sense.

Someone else suggested:

Don't mess about - this is a joint bo/dba task. Export all reports from the repos to disk, then make a note of all the DAS/BCA job settings. Then, in the db, truncate the various tables, i.e. ds_pending_job and obj_x_documents. Then coalesce the tablespace. This should free up all that extra space in there

NO NO NO! You should never truncate database tables that have referential integrity rules, not in a production environment. If you need to reclaim space, talk to your DBA about doing an export, compressing the extents, and reimporting. But do the ENTIRE repository at once, not just a few selected tables. Your DBA should know how to do this.
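To get a feel for how fragmented the repository tables are before asking for that export/reimport, something like this should work (a sketch; 'REPO' is a made-up owner, so substitute the schema that owns your repository):

-- Extent counts per repository table; many extents per segment
-- suggests the export / compress / reimport is worth doing
select segment_name, count(*) as extents
from dba_extents
where owner = 'REPO'
group by segment_name
order by extents desc;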

Can anyone tell me where the LOV definition is stored?

As I mentioned earlier, the LOV definitions are stored in the document domain tables. They are also connected to the universe tables.

In the UNV_OBJCONTENT table there is a column OBJ_LOVID. If this column is zero, then no List of Values has been exported. If it is non-zero, then there is a custom LOV.

In the OBJ_M_DOCUMENTS table there will be a reference to a document whose M_DOC_N_ID links to the OBJ_LOVID from the UNV_OBJCONTENT table. The value of M_DOC_C_NAME will be the name, plus some other information, for the custom LOV. The OBJ_X_DOCUMENTS table contains the binary image of the LOV file that has been exported with the universe. The binary data has no meaning in its raw form, much like a standard document that has been exported. I don't see any way of looking just at the OBJ_X_DOCUMENTS table and deciding what sort of value you are looking at.
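Putting those links together, something like this should list the exported custom LOVs by name (a sketch built from the column names above; I have not validated it as written):

-- Custom LOVs: universe objects whose OBJ_LOVID points at a
-- document-domain master record
select u.obj_lovid, d.m_doc_c_name
from unv_objcontent u, obj_m_documents d
where u.obj_lovid = d.m_doc_n_id
  and u.obj_lovid != 0;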

A document that is a document rather than a custom LOV will have a document ID that is not found in the UNV_OBJCONTENT table. Under Oracle, you can determine this with the following:

select x_doc_n_id
from obj_x_documents
MINUS
select obj_lovid
from unv_objcontent

The values that show up are document IDs, rather than LOV IDs.

I have a document in DAS which runs with a custom script. It doesn't go to any user, nor is it posted on the web. It is instead faxed to the customer fax numbers, which the script references from the CONTACTS of the NT box's Outlook.

It has to appear as a report file somewhere. There is a file that is being printed to your fax driver; DAS does not have the capability to fax directly. You need to find out how the process works and trace back to the report file.

Now I know I can change the status of the report from Expired to Waiting and make the existing copy run. But I want to make changes in this document now, so I want a copy of the existing document and script. I didn't send a copy to the repository last time. How can we retrieve it from the repository database (both document and script)?

The server that runs DAS has a temporary directory. Files that are being processed under DAS are temporarily stored (thus the name) there. This includes the script. If you are quick enough, you can capture the script from the temporary directory. However, DAS deletes files from the directory when the job is done.

Perhaps you can get the server administrator to revoke Delete permissions on that directory? That way DAS might report the job as failed, but you would be able to get in and get a copy of the script.

I asked this before… are there no backups being done at this site? Surely you can go back - even a year if you have to - and find the source document.

As I wrote this, I got to thinking… where are the scripts stored in the repository? And if I can find them, are they stored in text (SPT) or tokenized (SPX) form? I believe that scripts are also stored in the OBJ_X_DOCUMENTS table, but I have not been able to retrieve anything from the LONG RAW datatype used in my Oracle repository. Since I have not been able to retrieve anything, I can't tell whether they are in SPX or SPT form.

Regards,
Dave Rathbun
Integra Solutions
www.islink.com


Listserv Archives (BOB member since 2002-06-25)