Teradata, load triggers and transactions BT ET

Hi all,

I'm seeing a strange result when using Load Triggers with a Teradata 13 target (via ODBC).

Normal loaders batch up inserts according to the ‘rows per commit’ value, issue a BT, then the compound INSERT, then an ET.
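
Roughly the pattern I mean, as a sketch only (table and column names are made up here, and the actual DS-generated SQL will differ):

BT;
-- parameterised insert, batched for up to 'rows per commit' rows
INSERT INTO tgt_table (id, payload) VALUES (?, ?);
ET;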

I have had to use a load trigger to replace the INSERT operation, because I want to use a slightly modified INSERT statement (a string literal with the ''XB qualifier to load a VARBYTE column). This works, but very slowly.
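
The kind of statement the load trigger issues looks something like this (illustrative names and values only; the important bit is the hexadecimal literal with the XB suffix, which Teradata treats as a VARBYTE value):

-- 'hex digits'XB is a Teradata hexadecimal byte (VARBYTE) literal
INSERT INTO tgt_table (id, payload) VALUES (1001, '1A2B3C'XB);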

When we look at the Teradata logs, we see:

BT
INSERT … blah
ET
ET (Error 3510: Too many END TRANSACTION statements.)
ET
ET (Error 3510: Too many END TRANSACTION statements.)
BT
ET
(repeat)

To put it another way: each insert is wrapped in its own BT/ET (begin transaction / end transaction) pair, and then there are several empty transactions.

I think that even if the individual inserts were just wrapped together in a single BT/ET, it would be much faster.
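
i.e. something like this (again just a sketch with made-up values):

-- one BT/ET around the whole batch, rather than one per row
BT;
INSERT INTO tgt_table (id, payload) VALUES (1001, '1A'XB);
INSERT INTO tgt_table (id, payload) VALUES (1002, '2B'XB);
INSERT INTO tgt_table (id, payload) VALUES (1003, '3C'XB);
ET;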

Does anyone know where all of these extra transactions are coming from?

From the manual, there is a bit of a warning:

The software does not parse load triggers. Thus, when you specify a load trigger, the software does not parameterise SQL statements. As a result, load times might be higher when you use load triggers. 

…but nothing is said about transaction control, as far as I can see.

Any help greatly appreciated!


philmorris :uk: (BOB member since 2002-11-12)

Are any of the columns in the Dataflow using the LONG data type?


eganjp :us: (BOB member since 2007-09-12)

Hi, no, there are no LONG columns, but the behaviour is very similar, as if it has automatically reduced ‘rows per commit’ down to 1.
There doesn’t seem to be any warning message logged.


philmorris :uk: (BOB member since 2002-11-12)