Wednesday, March 21, 2012

Package failure still causing partial load

I have a package that is failing because of a truncation error. By default (and I leave this as-is for ALL my packages), if one row fails processing, the entire package should fail and nothing should be loaded into the database. But instead I am actually getting a partial load.

I have confirmed the "Rows per batch" value (blank) and the "Maximum insert commit size" value (0) in the OLE DB Destination Editor, so I have no idea what is going on. Are there any other properties I should be checking?

Thanks.

Jason

Why are you surprised that there is a partial load? You'll have to set MICS (Maximum insert commit size) equal to or greater than the number of rows coming in from the source so that if one row is bad, the whole batch is aborted.

The settings you have now are likely committing each row as it is inserted.

|||

According to the documentation, a "value of 0 indicates that all data is committed in a single batch after all rows have been processed". To me, that says either all rows will be written or none. Since I am getting a truncation error, shouldn't I be getting zero rows written?

|||

Well, I'm not so sure that SQL Server can accept an arbitrarily large bulk load. That is, I believe there's a limit to the size of a batch, and if it's exceeded, SQL Server has to issue a commit or fail. I'm not sure.

You could redirect error rows out of the OLE DB Destination and try to see where the error occurs (row number or something).
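
To illustrate the batching point with plain T-SQL (the file path and table name here are made up), BULK INSERT behaves the same way: with a BATCHSIZE, every completed batch is committed and survives a later failure; without one, the whole file is a single batch.

    -- Rough T-SQL analogy (path and table name are hypothetical).
    -- With BATCHSIZE, each 10,000-row batch commits as it completes, so a
    -- truncation error in batch 5 still leaves batches 1-4 in the table.
    BULK INSERT dbo.TargetTable
    FROM 'C:\loads\input.dat'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 10000
    );

    -- Omit BATCHSIZE and the whole file is one batch: one bad row, zero rows
    -- kept, which is the all-or-nothing behavior MICS = 0 is documented to give.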

|||

Thanks for the responses, Phil.

I know exactly where the error is happening. It's just that in the past (with other packages) the execution loaded everything or nothing, no matter the number of rows in the source. The behavior of this package is not what I am used to, and I cannot figure out why it is doing a partial load when the settings (as far as I can tell) are telling it not to.
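
For reference, a quick way to see how far a failed run got is a plain row count against the destination (the table name here is a stand-in):

    -- How many rows were actually committed before the failure?
    SELECT COUNT(*) AS committed_rows
    FROM dbo.TargetTable;  -- hypothetical destination table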

- Jason

|||

To be honest, I have been playing with those two properties in the OLE DB Destination so I can get bigger batches, but the most I can get is around 9K-10K rows per commit. I'm not sure whether SSIS or BULK INSERT looks at the available resources and decides what value to use. I have not gotten the time to dig into that.
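
One guess, and it is only a guess: the data flow's DefaultBufferMaxRows defaults to 10,000 (with DefaultBufferSize at 10 MB), so if commits are happening a buffer at a time, a ceiling of roughly 9K-10K rows per commit is about what you'd expect. A back-of-the-envelope check, assuming a hypothetical 1 KB row:

    -- SSIS defaults: DefaultBufferSize = 10 MB, DefaultBufferMaxRows = 10000.
    -- With a hypothetical 1 KB row, the size limit allows ~10,240 rows per
    -- buffer, so the 10,000-row cap is the one that actually bites.
    SELECT (10 * 1024 * 1024) / 1024 AS rows_that_fit_by_size;  -- 10240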

|||

Rafael Salas wrote:

To be honest, I have been playing with those two properties in the OLE DB Destination so I can get bigger batches, but the most I can get is around 9K-10K rows per commit. I'm not sure whether SSIS or BULK INSERT looks at the available resources and decides what value to use. I have not gotten the time to dig into that.

I think that SQL Server can only handle a batch size of 256 MB.

http://msdn2.microsoft.com/en-us/library/ms143432.aspx
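
If that 256 MB ceiling is right, you can estimate the largest batch a given table could take. For example, with a made-up row width of 500 bytes:

    -- Back-of-the-envelope: rows fitting in one 256 MB batch at a
    -- hypothetical 500 bytes per row.
    SELECT (256 * 1024 * 1024) / 500 AS approx_rows_per_batch;  -- 536870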
