SQL right truncation on Bulk loads

Hi all,

I’ve got this issue that has me a little confused.

When I try to bulk load (with truncate -> nightly rebuild) one of our reporting tables, I keep getting this SQL Right Truncation error on SQL Server 2012:


7284	6472	RUN-051017	3/03/2014 8:28:40 p.m.	|Data flow DF_Fact_Booking_Breakdown_Rebuild|Loader Calculate_Measures_TABLE_001
7284	6472	RUN-051017	3/03/2014 8:28:40 p.m.	Warning: Bulk loading into table <F> failed with input row data
7284	6472	RUN-051017	3/03/2014 8:28:40 p.m.	<13344444_288383838_AE,9go0T6TYzdsR6aBj6bsDNA==,1431,3690,NULL,20110416,20110416,20110416,19800101,19800101,0.5428,0.5428,97.161
7284	6472	RUN-051017	3/03/2014 8:28:40 p.m.	2,0.0000,97.1612,0.0000,0.0000,0.0000,14.5742,0.0000,0.0000,0.0000,2012.10.14 09:08:36,2012.10.09 17:29:06>.

There are a bunch of them in a DF that processes 15+ million records.

Okay, I thought - something doesn’t quite ‘fit’; I can sort that out.

But I can’t seem to find the offending column(s). All the data seems to fit perfectly with the data types in the table. All these decimal values go into DECIMAL(19,4) columns, the character strings go into VARCHAR2(64) columns, and the dates into DATETIME2 columns.

I’ve created a duplicate of that table structure in the development database, took a few of the error records, and did a manual INSERT INTO using the same data as captured in the Data Services logs. No problem, warning, or error whatsoever?
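For reference, the manual test looked roughly like this (table and column names are illustrative, not the real schema; the values come from the logged error row):

```sql
-- Illustrative only: a cut-down copy of the target structure,
-- using the declared types mentioned above
CREATE TABLE dbo.Fact_Test (
    Booking_Ref  VARCHAR(64),
    Load_Date    DATETIME2,
    Amount       DECIMAL(19,4)
);

-- A few values from one of the "failed" rows insert cleanly,
-- with no truncation warning or error
INSERT INTO dbo.Fact_Test (Booking_Ref, Load_Date, Amount)
VALUES ('9go0T6TYzdsR6aBj6bsDNA==', '2012-10-14 09:08:36', 97.1612);
```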

When I change the Data Flow to either pushdown to the database (by generating a large INSERT INTO statement) or by forcing it to flow through the Job Server, it reports no issues whatsoever.

However, I’d prefer to use the bulk load feature - but how do I determine what the problem really is? And is there a problem to begin with, or is this some odd behaviour associated with the bulk loader?


ErikR :new_zealand: (BOB member since 2007-01-10)

Whenever we do a bulk load, we should make the VARCHAR columns in the target at least double, or even triple, the size of the source columns.

I know there is a SAP Note for this. Give it a try. I bet it will work.

Note: Do this only for the VARCHAR Columns.


ganeshxp :us: (BOB member since 2008-07-17)

Erik, in the error you posted I don’t see a right truncation error. Also, I don’t see any actual string values in that error. Just numbers and dates.

Does SQL Server have a VARCHAR2 data type? I thought only Oracle had that.


eganjp :us: (BOB member since 2007-09-12)

Triple the size of the VARCHAR columns when using bulk load? I’d better find that SAP Note before the DBA has me shot!

Jim, you are - as usual - completely right. I was mixing it up with a proper database :mrgreen: and hadn’t quoted the error/warning message in full:


 |Data flow DF_Table001|Loader Calculate_Measures_FACT_TABLE_001
 ODBC data source <SRVETL02> warning message for operation <bcp_sendrow>: 
<[Microsoft][ODBC SQL Server Driver]String data, right truncation>, SQLState <22001>.


ErikR :new_zealand: (BOB member since 2007-01-10)

Are you really using ODBC? If so, have you tried it with the native driver?

Do any of the source string values have a length close to the declared length of the column? I’m wondering whether you have a high-order character somewhere in a string that causes it to take up double the space. If that is indeed the issue, declaring your columns as NVARCHAR can often alleviate the error.
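One way to test that theory (a sketch; dbo.Fact_Source and Booking_Ref are placeholder names, not the real schema) is to hunt for characters outside the low-ASCII range:

```sql
-- Rows containing any character outside printable low ASCII.
-- A binary collation makes the [^ -~] range match by code point
-- rather than by the column's linguistic collation rules.
SELECT Booking_Ref
FROM   dbo.Fact_Source
WHERE  Booking_Ref LIKE '%[^ -~]%' COLLATE Latin1_General_BIN;
```

Any rows this returns contain characters that may behave differently under bulk load than a plain INSERT.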


eganjp :us: (BOB member since 2007-09-12)

No, it’s configured to use the SQL Native Driver. In fact, the Job Server sits on the SQL Server 2012 box itself, so all the native connectivity is there. The data source has also been configured correctly, so I have no idea why it’s throwing an ODBC error there. It could just be a generic message?

The max size of the VARCHAR column is 64, and most entries are 24 characters long, so nothing sits really close to that limit - and no special characters either; it’s all low ASCII.

I could try NVARCHAR - I’m still trying to find a SAP Note that tells me to double/triple the varchar size when using bulk loads.


ErikR :new_zealand: (BOB member since 2007-01-10)

With regard to the double/triple issue…

There was a bug a few years/versions back where the import of a table would incorrectly set the width of the column in the metadata. I remember seeing an Oracle VARCHAR2(20) column imported as either varchar(60) or varchar(80). It was fixed in the 3.1 timeframe. That may be what Ganesh is referring to.

Since you’re getting a SQL Server error and not a DS error, the database is seeing some value that exceeds the defined length in the table. You could try adding TRIM() functions on your string columns, or perhaps add a Validation transform that checks the length of the string columns.
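To make the length check concrete (again a sketch; dbo.Fact_Source and Booking_Ref are placeholder names), something like this would surface any values at or near the 64-character limit before the load runs:

```sql
-- LEN() ignores trailing spaces but DATALENGTH() does not,
-- so comparing both also catches space-padded values
SELECT Booking_Ref,
       LEN(Booking_Ref)        AS char_len,
       DATALENGTH(Booking_Ref) AS byte_len
FROM   dbo.Fact_Source
WHERE  DATALENGTH(Booking_Ref) > 60
ORDER BY byte_len DESC;
```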

Is your source a table or a file?


eganjp :us: (BOB member since 2007-09-12)

Erik, just curious - could you test the job by passing a hardcoded NULL to all DATETIME columns?


ganeshxp :us: (BOB member since 2008-07-17)

Yes, correct! I still think that could be the issue.


ganeshxp :us: (BOB member since 2008-07-17)

Thanks Jim and Ganesh - your insights and time are much appreciated.

I’ll try the NULLs as soon as I can. I also tried putting TRIM()s around it (plus substrings and hard casts to the correct data types), and while that did suppress the error, the performance was pretty hideous because everything has to be routed through the Job Server.

The source of all this is actually a set of tables in the same database as the target, with the same data type definitions between source and target.


ErikR :new_zealand: (BOB member since 2007-01-10)

Letting it do a SELECT INTO is a good workaround. But someday, someone will make a change that prevents the SELECT INTO and then … BOOM!


eganjp :us: (BOB member since 2007-09-12)

I’d troubleshoot it by getting native SQL Server bcp to work outside of BODS. BODS “goes through” bcp, I believe.


JeffPrenevost :us: (BOB member since 2010-10-09)