BusinessObjects Board

BO Metadata Management - Impact & Lineage information

Hi,

I am in the process of configuring BO Metadata Management within our environment. For some reason, the impact and lineage information for our Data Services integrator is not being produced. I have run calculate usage and column mappings for the BODS repository and can see the mappings information in the BODS repository tables. Looking through the download log, the error below is appearing. Has anyone seen this error before? If so, do you know how it can be resolved?

Thanks

Nilesh

[E-3026103] 2012-03-26 15:19:36.977 Java heap space
[E-3026103] 2012-03-26 15:19:46.071 Java heap space
[C-0000000] 2012-03-26 15:19:46.087 java.lang.OutOfMemoryError: Java heap space
at com.microsoft.sqlserver.jdbc.TDSPacket.<init>(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSReader.readPacket(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSReader.readPacket(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSReader.readResponse(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSCommand.startResponse(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(Unknown Source)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.queryForWorkToCollect(SourceTargetMappingWork.java:160)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.execute(SourceTargetMappingWork.java:143)
at com.bobj.mm.core.WorkThread.run(WorkThread.java:74)

[C-0000000] 2012-03-26 15:19:46.087 java.lang.OutOfMemoryError: Java heap space
at java.lang.String.<init>(String.java:208)
at java.lang.StringBuffer.toString(StringBuffer.java:586)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(Unknown Source)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(Unknown Source)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.queryForWorkToCollect(SourceTargetMappingWork.java:160)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.execute(SourceTargetMappingWork.java:143)
at com.bobj.mm.core.WorkThread.run(WorkThread.java:74)

[E-3026507] 2012-03-26 15:19:47.398 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[E-3026507] 2012-03-26 15:19:47.398 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[E-3026507] 2012-03-26 15:19:47.398 SQL Errors encountered while processing Object Source Target Mapping. The TDS protocol stream is not valid.
[E-3026103] 2012-03-26 15:19:47.398 java.lang.NullPointerException
[E-3026507] 2012-03-26 15:19:47.413 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[E-3026507] 2012-03-26 15:19:47.413 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[E-3026507] 2012-03-26 15:19:47.413 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[E-3026507] 2012-03-26 15:19:47.413 SQL Errors encountered while processing Object Source Target Mapping. The connection is closed.
[C-0000000] 2012-03-26 15:19:47.429 java.lang.NullPointerException
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(Unknown Source)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(Unknown Source)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.queryForWorkToCollect(SourceTargetMappingWork.java:160)
at com.bobj.mm.integrator.di.DIobjects.SourceTargetMappingWork.execute(SourceTargetMappingWork.java:143)
at com.bobj.mm.core.WorkThread.run(WorkThread.java:74)


Nilz07 (BOB member since 2007-05-22)

What is the version of BOMM? What is the version of DS?

The problem could be with the JVM max memory setting. To increase this, do the following:
Increase the JVM memory for the DS Integrator. You can do this when you schedule an integrator run from the CMC: click on the Parameters link and check the value of the JVM Arguments -Xms and -Xmx.

Set these values to -Xms512m -Xmx1024m if they are different.
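(For reference, -Xms is the initial heap size and -Xmx the maximum, so after the change the JVM Arguments field would read -Xms512m -Xmx1024m. The repeated java.lang.OutOfMemoryError entries in your log mean the integrator is hitting the -Xmx ceiling.)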

The problem could also be with a large number of column mappings. You may end up with a large number of column mappings because of the WHERE clause, so try regenerating the column mappings without the WHERE clause. For this, do the following:
Close Designer

Update the AL_ATTR table to mark that column mapping is not done for the DFs by running the following SQL in the DS repository:
UPDATE AL_ATTR SET ATTR_VALUE = 'no' WHERE ATTR_NAME = 'column_mapping_calculated' AND ATTR_VALUE = 'yes';

DELETE FROM AL_COLMAP;
DELETE FROM AL_COLMAP_NAMES;
DELETE FROM AL_COLMAP_TEXT;

Go to %LINK_DIR%\bin and open DSConfig.txt.
Set the value of IncludeWhereInCalculateColumnMapping to FALSE.
Open Designer and run Calculate Column Mappings again.

Once it's done, run the DS Integrator again.
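
For reference, after the change the DSConfig.txt entry should look something like the line below (search the file for the parameter name rather than assuming a particular section, as the file layout can differ between DS versions):

IncludeWhereInCalculateColumnMapping = FALSE

Before reopening Designer you can also sanity-check that the AL_ATTR reset took effect; this rough check should return 0 once the UPDATE above has run:

SELECT COUNT(*) FROM AL_ATTR WHERE ATTR_NAME = 'column_mapping_calculated' AND ATTR_VALUE = 'yes';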


manoj_d (BOB member since 2009-01-02)

Hi Manoj,

I am using BOMM version 12.1.0.9 and DS version 12.2.3.2.

I will try setting the values for the JVM Arguments first and see if this resolves the problem. Currently when trying to schedule, the default is set to -Xms64m -Xmx1024m.

I will let you know how I get on with this.

Thanks

Nilesh


Nilz07 (BOB member since 2007-05-22)

I changed the parameter but still got the same error. Can I increase the size further, or is this the max? We would also like to see the WHERE clause details in the lineage/impact analysis in BOMM.

Thanks

Nilesh


Nilz07 (BOB member since 2007-05-22)

The problem with WHERE clause column mapping is that it creates a relationship with each column of the target table. For example, if you have 1000 columns in the target table, a column in the WHERE clause will be associated with all 1000 columns of the target, resulting in too many rows to process.
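
To put rough, hypothetical numbers on it: a dataflow whose WHERE clause references 5 source columns against a 1000-column target adds 5 x 1000 = 5000 mapping rows from the WHERE clause alone, and that multiplies across every dataflow in the repository.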

The WHERE clause actually impacts a table or row. Ideally there should be a separate MM object called filter/condition, and instead of creating a relationship with each column of the target, a relationship should be created between this filter/condition object and the target table, something similar to a Universe filter and a report.

Coming to the issue that you are seeing: you can increase it a little more (max around 1400m), but it's a 32-bit JVM, so you will not be able to increase it to a higher value.

The other thing is why it is taking so much memory. I suspect the WHERE clause mapping, but that may not be the case.
How many rows do you have in the AL_COLMAP tables? (A quick way to check is sketched below.)
What is the number of rows that you see in the log file that DS is processing (source-target mapping)?
How many system configurations do you have?
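
For the AL_COLMAP counts, queries along these lines against the DS repository should be enough (a rough sketch, using the same tables as the cleanup steps earlier in the thread):

SELECT COUNT(*) FROM AL_COLMAP;
SELECT COUNT(*) FROM AL_COLMAP_NAMES;
SELECT COUNT(*) FROM AL_COLMAP_TEXT;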


manoj_d (BOB member since 2009-01-02)

2727173

We currently have 3 set up: BOE, BODS, and a relational schema.

Attached is a log file for your reference
integrator.zip (48.0 KB)


Nilz07 (BOB member since 2007-05-22)

Looking at the log, the processing hits this error and then runs fine after that. I have seen this behaviour with the DS Integrator before, but it was not reproducible in my environment.

I'm not sure if the many warnings before this error might be causing it; something is going wrong when inserting the row in the JDBC driver.

Would it be possible for you to file a case with support?


manoj_d (BOB member since 2009-01-02)

Hi Manoj,

Thanks for your help. I have opened a case with SAP in case I cannot find a resolution on the forum.

I will update this post with the findings from SAP.

Thanks

Nilesh


Nilz07 (BOB member since 2007-05-22)

The incident is not yet assigned to support; once it is, I can work with support on it. I only see Integrator.log in the attachments. To debug this we will need the DS repository exported to an ATL file; can you attach the ATL to the incident as well?


manoj_d (BOB member since 2009-01-02)

I have added the ATL to the case; the support person asked for it as well.


Nilz07 (BOB member since 2007-05-22)

Yes, I got the ATL from the case and am able to reproduce the issue using your ATL. I am debugging it; the strange thing is that it continues with the collection after some DB-related error.


manoj_d (BOB member since 2009-01-02)

Yes, the same thing happened for me. I let the column mapping calculation continue and it finished in around an hour.


Nilz07 (BOB member since 2007-05-22)

Hi Manoj,

Where do you see the DB-related error? I also noticed that it continues the collection, but I do not see any DB-related error logged anywhere.

The support engineer is also having trouble generating the column mappings. Eventually the column mappings do generate, but for some reason BOMM is not collecting the data from the DS repository. I suspect it may be due to the large data volumes in the column mapping repository tables, but the support engineer has not confirmed this. :hb:

Nilz


Nilz07 (BOB member since 2007-05-22)