When I migrate ETL jobs from the development environment to the production environment using ATL files, I exclude the datastores from the development environment and refresh the datastore with the table information in the production environment.
But the ETL jobs fail to recognise this change.
Why does this happen, and what is the workaround for it?
This is exactly how it is done.
But some mismatch occurs occasionally, and it is resolved only when the table definitions are also migrated from the development environment to the production environment.
There is no clue why this is happening. Even though the table is present, the job says the table does not exist, and so on.
Provided that the actual tables in the database match between dev and prod, I can't see why you are getting mismatches.
Could it be that there are multiple users migrating to production and someone is overwriting the table definition in prod with an out-of-date one?
If you are confident that the table definitions in development do match the ones in production, then there wouldn't be an issue including the table definitions with the job ATL and just excluding the datastore definitions. That's certainly what I do when I'm sure I know what is where.
Yes, the datastore-level export is done only after making the changes in the database, but the changes were not getting reflected.
Multiple users are migrating to production, all using the exclude-datastore-and-tables mechanism.
There was a problem where the input schema did not match the output schema.
Then, in the table comparison transform, the table was re-selected, which resolved the problem. I'm not sure whether this table re-selection should be done every time.
But it appears something is getting missed.
I think in one of the DI versions (11.7.2 ??) there was a bug whereby table comparisons weren't getting updated after re-importing the tables, which caused issues.
What version are you using? Maybe Werner will be able to help with this, as it sounds like a bug, I'm afraid.
Exclude the datastore settings but include the table definitions while exporting. I guess it should work fine with that; that's how we have been doing it, actually.
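The export choice being suggested can be sketched abstractly. This is purely illustrative, not a real DI API: the object names and the `filter_for_promotion` helper are hypothetical, and the sketch only models the idea of keeping the job and table definitions in the ATL while dropping the datastore definition:

```python
# Hypothetical model of an ATL export set. In the real tool this selection
# is made in the export dialog, not in code.
export_set = [
    {"type": "datastore", "name": "DS_PROD"},            # connection settings: exclude
    {"type": "table", "name": "DS_PROD.CUSTOMER"},       # table definition: include
    {"type": "job", "name": "JOB_LOAD_CUSTOMER"},        # the job itself: include
]

def filter_for_promotion(objects, exclude_types=("datastore",)):
    """Keep everything except the object types excluded from the export."""
    return [o for o in objects if o["type"] not in exclude_types]

promoted = filter_for_promotion(export_set)
print([o["name"] for o in promoted])  # ['DS_PROD.CUSTOMER', 'JOB_LOAD_CUSTOMER']
```

The point of excluding only the datastore is that the production connection settings stay untouched, while the table metadata the job was built against travels with it.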
If it is a multi-user environment, why are you not using the central repository method? I have also faced a few issues going with .atl files; in fact, it is tough to maintain versions that way too!
Metadata, to my knowledge, is created for the job at development time. If you exclude the table definitions, you are excluding part of the metadata that was created,
and it then has to match up that missing data with the metadata in production... I don't know if it will work that way.
And how do you expect it to match up with the metadata?
It can't really be called a workaround, as you say; to my knowledge, all my colleagues do it that way, and we haven't faced that problem.
Your point that it occurs in only a few jobs seems valid; it looks like your other jobs have worked for some reason. Werner can probably explain to us how the metadata is stored in the back end,
because the jobs are built from joins between metadata tables; if they don't find the matching key, they won't work properly.
But when you bring the table definition along, it brings the same key with it, so it might match...
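The key-matching idea above can be sketched in miniature. This is an illustration only: the dictionaries and the internal key are hypothetical stand-ins, not the actual DI repository schema. It just shows why a job that references imported table metadata by an internal key can report "table does not exist" even though the table is physically present, if a re-import assigns the metadata a different key:

```python
# Hypothetical repository contents: imported table metadata keyed by an
# internal id, and a dataflow that references its source table by that id.
tables = {101: {"name": "CUSTOMER", "columns": ["ID", "NAME"]}}
dataflow = {"name": "DF_LOAD_CUSTOMER", "source_table_key": 101}

def resolve_source(dataflow, tables):
    """Join the dataflow to its source table metadata by internal key."""
    key = dataflow["source_table_key"]
    if key not in tables:
        raise LookupError(f"table does not exist (no metadata for key {key})")
    return tables[key]

print(resolve_source(dataflow, tables)["name"])  # CUSTOMER

# If the table metadata is replaced with a copy under a fresh key (e.g. a
# re-import that doesn't line up with the migrated job), the reference is
# orphaned even though an identically named table is present:
tables = {102: {"name": "CUSTOMER", "columns": ["ID", "NAME"]}}
try:
    resolve_source(dataflow, tables)
except LookupError as err:
    print(err)  # table does not exist (no metadata for key 101)
```

Migrating the table definitions together with the job keeps both sides of that join consistent, which is the "brings the same key" point above.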
I agree; I would always exclude the datastore but include the tables. I'm not 100% sure what is happening in your case: it could be as simple as the re-import after your export not working or being forgotten, or the prod database having the old table layout.
The reasons why I would include the table are:
it is part of the dataflow (when you export a WorkFlow you export its DFs as well, don't you?), and,
for safety reasons, the table may have been newly created and may not exist in the prod repo yet.
Yes, the re-import simply was not working. That was the problem, even though everything was fine at the database level.
The table existed in the production database, but the data flow failed to recognise it.
Both the job and the table were re-imported repeatedly, but no luck.
Some table comparison transforms were not getting refreshed either.
Basically, I just ensured that no metadata from the development environment came into production, since it is the live environment, and we do have the option of selecting the table metadata directly from the production database.
Is it safe to include the table metadata from development in production?
Could you please confirm?