Table metadata not getting refreshed

Hi,

When I migrate ETL jobs from the development environment to the production environment using ATL files, I exclude the datastores in the development environment and refresh the datastore with the table information in the production environment.
But the ETL jobs fail to recognise this change.

Why does this happen and what is the work around for this?

Could anyone please give a prompt reply?

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

Can you just clarify that you mean you exclude the datastore and the table definitions when creating your ATL?

If so then in your case the correct order of migrating should be:

  1. Create the ATL in development, excluding the Datastore and Table definitions
  2. Import the ATL into production
  3. Refresh the table definitions as required.

ScoobyDoo :uk: (BOB member since 2007-05-10)

Thanks for the reply.

This is exactly how it is done.
But occasionally a mismatch occurs, and it is resolved only when the table definitions are also migrated from the development environment to production.

There is no clue why this is happening. Even though the table is present, the job says the table does not exist, and so on.

Could you please help on this?

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

Provided that the actual tables in the database match between dev and prod, I can't see why you are getting mismatches.

Could it be that there are multiple users migrating to production and someone is overwriting the table definition in prod with an out-of-date one?

If you are confident that the table definitions in development do match the ones in production, then there wouldn't be an issue including the table definitions with the job ATL and just excluding the datastore definitions. That's certainly what I do when I'm sure I know what is where.


ScoobyDoo :uk: (BOB member since 2007-05-10)

Yes, the datastore-level export is done only after the changes are made in the database. But the changes were not getting reflected.

Multiple users are migrating to production, all excluding the datastore and table definitions.

There was a problem where the input schema did not match the output schema.
Re-selecting the table in the Table_Comparison transform resolved the problem, but I am not sure whether this re-selection should be done every time.
It appears something is getting missed out.

Could you please advise?

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

I think in one of the DI versions (11.7.2?) there was a bug whereby table comparisons weren't getting updated after re-importing the tables, which caused issues.

What version are you using? Maybe Werner will be able to help with this as it sounds like a bug I’m afraid.


ScoobyDoo :uk: (BOB member since 2007-05-10)

Here we use XI 3.1 (12.1.0.0)


ijy_david (BOB member since 2008-12-03)

Exclude the datastore settings but include the table definitions while exporting. I guess it should work fine that way; that's how we have actually been doing it.

Just try that and let me know.

Werner, could you please confirm this?


dragonwhiz :cn: (BOB member since 2009-09-25)

Sorry to interrupt at this point!!

If this is a multi-user environment, why are you not using the central repository method? I too have faced a few issues when going with .atl files. In fact it is tough to maintain the versions as well!!!


ganeshxp :us: (BOB member since 2008-07-17)

Excluding the datastores and including the table definitions does work. That is a work-around.

But I don't want the development environment metadata to come to production, so why is it not fetching it from the production metadata?

Ganesh.

We do use Central Repository for the development environment. But we are not providing connection to prod environment from it.

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

To my knowledge, the metadata is actually created for the job at development time. If you exclude the table definitions, you are excluding part of the metadata that was created.

So it has to match up that missing data with the one from production... I don't know if it works that way. :cry:
And how do you expect it to match it up with the metadata?

It cannot be called a work-around, as you say. To my knowledge all my friends do it that way only, and we haven't faced that problem.


dragonwhiz :cn: (BOB member since 2009-09-25)

This problem appears only in a certain few jobs. That's why I mentioned it as a work-around.

The same metadata is present in production as well, and the dataflows need to fetch the production metadata. Is this not right?

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

Your point that it occurs in only a few jobs seems to be valid. It looks like your other jobs have worked for some reason. Werner can probably explain to us how the metadata is stored in the back end.

Because the jobs are built from joins between metadata tables, if they don't find the matching key, they won't work properly.

But when you bring the table definition along, it brings the same key as well, so it might match...

Werner, please comment on it.


dragonwhiz :cn: (BOB member since 2009-09-25)
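The key-join idea described above can be sketched conceptually. Everything below is purely illustrative: the dictionary layout, key values, and names are hypothetical stand-ins, not Data Services' actual repository schema.

```python
# Conceptual sketch of why a dataflow can fail to resolve a table after a
# partial ATL import. The repository layout and key names here are
# hypothetical; Data Services' real internal metadata tables differ.

# Table definitions present in the production repository, keyed by an
# internal identifier assigned at import time.
prod_tables = {
    101: {"name": "CUSTOMER", "datastore": "DS_PROD"},
}

# A dataflow exported from dev references its source table by the key it
# had in the dev repository.
dataflow = {"name": "DF_LOAD_CUSTOMER", "source_table_key": 205}

def resolve_source(df, tables):
    """Join the dataflow's table reference against repository metadata."""
    key = df["source_table_key"]
    if key not in tables:
        # The table may exist in the database itself, but the repository
        # join fails, so the job reports that the table does not exist.
        return None
    return tables[key]["name"]

print(resolve_source(dataflow, prod_tables))  # None: key 205 is missing

# Including the table definitions in the same ATL brings the matching
# key along with the dataflow, so the join succeeds.
prod_tables[205] = {"name": "CUSTOMER", "datastore": "DS_PROD"}
print(resolve_source(dataflow, prod_tables))  # CUSTOMER
```

In this picture, re-importing the table from the database would also work, as long as it ends up under the key the dataflow expects, which is consistent with why re-selecting the table in the transform fixed the schema mismatch.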

I agree; I would always exclude the datastore but include the tables. I'm not 100% sure what is happening in your case. It could be as simple as the re-import after your export not working or being forgotten, or the prod database having the old table layout.

Reason why I would include the table is

  • It is part of the dataflow. When you export a WorkFlow you export its DFs as well, don't you?
  • For safety reasons: maybe that table was newly created and doesn't exist in the prod repo yet.
  • There is no harm in it being included.

Werner Daehn :de: (BOB member since 2004-12-17)

Thanks for the mail.

Yes, the re-import was simply not working. That was the problem, even though at the database level everything was fine.
The table existed in the production database, but the dataflow failed to recognise it.
Both the job and the tables were re-imported repeatedly, but no luck.

Some Table_Comparison transforms were not getting refreshed either.

Basically, I just ensured that no metadata from the development environment came to production, since it is the live environment and we do have the option of selecting the table metadata directly from the production database.

Is it safe to include the table metadata from development in production?
Could you please confirm?

Regards,
Ijy


ijy_david (BOB member since 2008-12-03)

Yes, in my view the table definition is part of the dataflow definition.


Werner Daehn :de: (BOB member since 2004-12-17)