
Wednesday, March 21, 2012

Package failure still causing partial load

I have a package that is failing because of a truncation error. By default (and I leave the defaults in place for ALL my packages), if one row fails processing, the entire package should fail and nothing should get loaded into the database. Instead, I am getting a partial load.

I have confirmed the "Rows per batch" value (blank) and the "Maximum insert commit size" value (0) in the OLE DB Destination Editor, so I have no idea what is going on. Are there any other properties I should be checking?

Thanks.

Jason

Why are you surprised that there is a partial load? You'll have to set the Maximum insert commit size (MICS) to a value equal to or greater than the number of rows coming out of the source to get the behavior where, if one row is bad, the whole batch is aborted.

The settings you have now are likely committing each row as it is inserted.

|||

According to the documentation, a "value of 0 indicates that all data is committed in a single batch after all rows have been processed". That, to me, says that all rows will be written or none. Since I am getting a truncation error, should I not be getting zero rows written?

|||

Well, I'm not so sure that SQL Server can accept an arbitrarily large bulk load. That is, I believe there is a limit to the size of a batch, and if it is exceeded, it will have to issue a commit or fail. I'm not sure.

You could redirect error rows out of the OLE DB Destination and try to see where that error occurs (row number or something).

|||
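For what it's worth, the difference between committing once at the end and committing per batch is easy to see outside of SSIS. A minimal sketch in plain pyodbc (the DSN and the Staging table are invented names): the first function commits a single transaction, so a truncation error leaves nothing in the table; the second commits per batch, so a failure part-way through leaves a partial load behind.

    import pyodbc

    # Illustration only: the DSN and the Staging table are made-up names.
    conn = pyodbc.connect("DSN=TargetDb", autocommit=False)
    cur = conn.cursor()

    # The third value is deliberately too wide for the column -> truncation error.
    rows = [("a" * 10,), ("b" * 10,), ("x" * 500,)]

    def load_single_transaction(rows):
        """Commit once at the end: if any row fails, nothing is persisted."""
        try:
            for r in rows:
                cur.execute("INSERT INTO Staging (Col1) VALUES (?)", r)
            conn.commit()
        except pyodbc.Error:
            conn.rollback()          # the whole load is undone

    def load_per_batch(rows, batch_size=1):
        """Commit every batch_size rows: earlier batches survive a later failure."""
        for i, r in enumerate(rows, start=1):
            try:
                cur.execute("INSERT INTO Staging (Col1) VALUES (?)", r)
                if i % batch_size == 0:
                    conn.commit()    # earlier rows are now permanent...
            except pyodbc.Error:
                conn.rollback()      # ...so this only undoes the current batch
                raise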

Thanks for the responses, Phil.

I know exactly where the error is happening. It's just that in the past (with other packages) the execution loaded everything or nothing, no matter the number of rows in the source. The behavior of this package is not what I am used to, and I cannot figure out why it is doing a partial load when the settings (as far as I can tell) are telling it not to.

- Jason

|||

To be honest, I have been playing with those two properties in the OLE DB Destination so I can get bigger batches, but the most I can get is around 9K-10K rows per commit. I am not sure whether SSIS or BULK INSERT looks at the available resources and decides what value to use; I have not had the time to dig into that.

|||

Rafael Salas wrote:

To be honest, I have been playing with those two properties in the OLE DB Destination so I can get bigger batches, but the most I can get is around 9K-10K rows per commit. I am not sure whether SSIS or BULK INSERT looks at the available resources and decides what value to use; I have not had the time to dig into that.

I think that SQL Server can only handle a batch size of 256 MB.

http://msdn2.microsoft.com/en-us/library/ms143432.aspx
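If that 256 MB figure holds, a quick back-of-the-envelope calculation (the row width is an assumption) gives a rough ceiling on rows per committed batch:

    # Back-of-the-envelope check against the 256 MB figure quoted above.
    MAX_BATCH_BYTES = 256 * 1024 * 1024   # claimed upper bound per committed batch
    ASSUMED_ROW_BYTES = 2 * 1024          # assumed average row width (~2 KB)

    print(MAX_BATCH_BYTES // ASSUMED_ROW_BYTES)   # 131072 rows for a 2 KB row

Even with a fairly wide 2 KB row that works out to roughly 131,000 rows per batch, well above the 9K-10K Rafael is seeing. His ceiling is suspiciously close to the data flow's default DefaultBufferMaxRows of 10,000, so the buffer settings, rather than a server-side limit, may be what he is actually hitting.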

Tuesday, March 20, 2012

Package dies after about 10000 seconds.

I have written a package that archives off old orders overnight. It appears that this package fails after about 10,000 seconds every time it is run. I don't think it is memory, as I am running it and checking for memory leaks.

A basic rundown of the package:

Execute SQL task to get the orders to delete.

A For Loop that loops over each order number.

Within the For Loop there are two data flows.

Data flow 1:

Find the related records in the child tables (OLE DB connection using a query). Using a multicast split, first check (with a lookup) for records already in the archive database and only copy rows when the lookup fails; second, delete the related records.

Data flow 2:

Do the same, but for the parent table.

(A rough script-style sketch of this flow follows the question, for reference.)

SP1 CTP is installed on the server.

Any ideas?
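For reference, here is the per-order flow described above written out as a plain script. The table names (Orders, OrderLines), columns, and DSNs are all invented for illustration; the real package does this with SSIS data flows rather than row-by-row code.

    import pyodbc

    # Connection DSNs are assumptions; adjust for the real servers.
    src = pyodbc.connect("DSN=OrderDb")
    arc = pyodbc.connect("DSN=ArchiveDb")

    def archive_order(order_number):
        """Mirror of the two data flows for a single order number."""
        s, a = src.cursor(), arc.cursor()

        # Data flow 1: child table (OrderLines and its columns are made up).
        s.execute("SELECT OrderNumber, LineId, Sku, Qty FROM OrderLines WHERE OrderNumber = ?",
                  order_number)
        for row in s.fetchall():
            # Lookup against the archive; copy only the rows the lookup does NOT find.
            a.execute("SELECT 1 FROM OrderLines WHERE OrderNumber = ? AND LineId = ?",
                      row.OrderNumber, row.LineId)
            if a.fetchone() is None:
                a.execute("INSERT INTO OrderLines (OrderNumber, LineId, Sku, Qty) VALUES (?, ?, ?, ?)",
                          row.OrderNumber, row.LineId, row.Sku, row.Qty)
        # Then delete the related child records from the source.
        s.execute("DELETE FROM OrderLines WHERE OrderNumber = ?", order_number)

        # Data flow 2: same pattern for the parent table.
        s.execute("SELECT OrderNumber, CustomerId, OrderDate FROM Orders WHERE OrderNumber = ?",
                  order_number)
        row = s.fetchone()
        if row is not None:
            a.execute("SELECT 1 FROM Orders WHERE OrderNumber = ?", order_number)
            if a.fetchone() is None:
                a.execute("INSERT INTO Orders (OrderNumber, CustomerId, OrderDate) VALUES (?, ?, ?)",
                          row.OrderNumber, row.CustomerId, row.OrderDate)
            s.execute("DELETE FROM Orders WHERE OrderNumber = ?", order_number)

        arc.commit()
        src.commit()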

You need to identify where the bottleneck is. Your log file should give some clues as to which task is taking a long time.

-Jamie

|||

This might sound like a dumb question, but where is my log file?

|||

Not a dumb question at all.

You need to configure logging for your package. Right-click on the control-flow surface and select "Logging..."

-Jamie
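Once logging is configured with the SQL Server log provider (with at least the OnPreExecute and OnPostExecute events selected), the log lands in a table that can be queried to see how far the package got and which task ran longest before it died. A sketch, assuming the SQL Server 2005 default log table dbo.sysdtslog90 and an invented DSN:

    import pyodbc

    # The DSN is invented; dbo.sysdtslog90 is the default SQL Server 2005 log table.
    conn = pyodbc.connect("DSN=LogDb")
    cur = conn.cursor()

    # Pair OnPreExecute/OnPostExecute rows for the most recent execution and
    # report how long each task ran (tasks with no OnPostExecute were still
    # running when the package died).
    cur.execute("""
        SELECT pre.source,
               DATEDIFF(second, pre.starttime, COALESCE(post.endtime, GETDATE())) AS seconds
        FROM dbo.sysdtslog90 AS pre
        LEFT JOIN dbo.sysdtslog90 AS post
               ON post.executionid = pre.executionid
              AND post.sourceid    = pre.sourceid
              AND post.event       = 'OnPostExecute'
        WHERE pre.event = 'OnPreExecute'
          AND pre.executionid = (SELECT TOP 1 executionid
                                 FROM dbo.sysdtslog90
                                 ORDER BY starttime DESC)
        ORDER BY seconds DESC
    """)
    for source, seconds in cur.fetchall():
        print(f"{seconds:>8}s  {source}")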

|||

If you have any connections to remote databases, you will also want to check those. Some connections get lost or dropped after the run has continued for a couple of minutes, hours, etc. (It happened to me once because the vendor I was downloading the data from had a batch process at their end that always interrupted my download.)

Also, you can try doing batch inserts into your archive table by setting "Rows per batch" to a decent number. That may help if your inserts are very, very large.

But first, check your logs as Jamie said.