Tuesday, March 20, 2012

Package dies after about 10000 seconds.

I have written a package that archives old orders overnight. It appears that this package fails after about 10,000 seconds every time it is run. I don't think it is memory, as I have run it while checking for memory leaks.

A basic rundown of the package:

An Execute SQL task gets the orders to delete.

A For Loop container loops over each order number.

Within the For Loop there are two data flows.

Data flow 1:

Find related records in the child tables (OLE DB connection using a query).

Using a Multicast split: first, check (with a Lookup) for records already in the archive database, and only copy rows on a Lookup failure; second, delete the related records.

Data flow 2:

Do the same, but for the parent table.
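The per-order logic described above (copy child rows that fail the archive Lookup, delete them, then do the same for the parent) can be sketched outside SSIS. This is only an illustration of the pattern, using Python's sqlite3 in place of the real OLE DB connections; the table and column names (Orders, OrderLines, ArchiveOrders, ArchiveOrderLines) are invented, not from the post.

```python
import sqlite3

# Hypothetical schema: Orders (parent) and OrderLines (child), mirrored in
# archive tables. All names here are illustrative, not the poster's schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Orders(OrderNumber INTEGER PRIMARY KEY, Placed TEXT);
CREATE TABLE OrderLines(OrderNumber INTEGER, Item TEXT);
CREATE TABLE ArchiveOrders(OrderNumber INTEGER PRIMARY KEY, Placed TEXT);
CREATE TABLE ArchiveOrderLines(OrderNumber INTEGER, Item TEXT);
INSERT INTO Orders VALUES (1, '2011-01-01'), (2, '2011-02-01');
INSERT INTO OrderLines VALUES (1, 'widget'), (1, 'gadget'), (2, 'sprocket');
INSERT INTO ArchiveOrders VALUES (1, '2011-01-01');  -- order 1 already archived
""")

def archive_order(conn, order_number):
    """Copy child rows, then the parent row, skipping rows already archived
    (the Lookup-failure path in the package), then delete the originals."""
    conn.execute("""
        INSERT INTO ArchiveOrderLines
        SELECT * FROM OrderLines ol WHERE ol.OrderNumber = ?
          AND NOT EXISTS (SELECT 1 FROM ArchiveOrderLines a
                          WHERE a.OrderNumber = ol.OrderNumber AND a.Item = ol.Item)
    """, (order_number,))
    conn.execute("DELETE FROM OrderLines WHERE OrderNumber = ?", (order_number,))
    conn.execute("""
        INSERT INTO ArchiveOrders
        SELECT * FROM Orders o WHERE o.OrderNumber = ?
          AND NOT EXISTS (SELECT 1 FROM ArchiveOrders a
                          WHERE a.OrderNumber = o.OrderNumber)
    """, (order_number,))
    conn.execute("DELETE FROM Orders WHERE OrderNumber = ?", (order_number,))
    conn.commit()

for (order_number,) in conn.execute("SELECT OrderNumber FROM Orders").fetchall():
    archive_order(conn, order_number)
```

Note the row-by-row loop over order numbers: doing this one order at a time is exactly what makes a run slow compared with a single set-based INSERT/DELETE over all eligible orders.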

SP1 CTP is installed on the server.

Any ideas?

You need to identify where the bottleneck is. Your log file should give some clues as to which task is taking a long time.

-Jamie

|||This might sound like a dumb question, but where is my log file?|||

Not a dumb question at all.

You need to configure logging for your package. Right-click on the control-flow surface and select "Logging..."

-Jamie
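Once logging is enabled (for example, to a SQL Server provider), the timings per task can be pulled out of the log. The sketch below mocks the SSIS 2005 log table `dbo.sysdtslog90` with Python's sqlite3 and pairs each task's OnPreExecute/OnPostExecute events to find the slowest task; the sample rows and task names are invented for illustration.

```python
import sqlite3

# Mock of the SSIS 2005 SQL Server log table (dbo.sysdtslog90); the columns
# shown are a subset, and the sample rows below are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sysdtslog90(event TEXT, source TEXT, starttime TEXT, endtime TEXT)")
conn.executemany(
    "INSERT INTO sysdtslog90 VALUES (?,?,?,?)",
    [("OnPreExecute",  "Get orders",   "2012-03-20 01:00:00", "2012-03-20 01:00:00"),
     ("OnPostExecute", "Get orders",   "2012-03-20 01:00:05", "2012-03-20 01:00:05"),
     ("OnPreExecute",  "Archive loop", "2012-03-20 01:00:05", "2012-03-20 01:00:05"),
     ("OnPostExecute", "Archive loop", "2012-03-20 03:47:00", "2012-03-20 03:47:00")])

# Duration per task: first OnPreExecute to last OnPostExecute, in seconds.
rows = conn.execute("""
    SELECT source,
           CAST(ROUND((julianday(MAX(CASE WHEN event='OnPostExecute' THEN starttime END))
                     - julianday(MIN(CASE WHEN event='OnPreExecute'  THEN starttime END)))
                     * 86400) AS INTEGER) AS seconds
    FROM sysdtslog90
    GROUP BY source
    ORDER BY seconds DESC
""").fetchall()
for source, seconds in rows:
    print(source, seconds)
```

The task at the top of that list is the bottleneck to dig into first.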

|||If you have any connections to remote DBs, you also want to check those. Some connections get lost or dropped after the run continues for a couple of minutes, hours, etc. (It happened to me once because the vendor I was downloading data from had a batch process at their end that always interrupted my download.)
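If a remote connection does drop partway through a long run, one option is to retry the failed step rather than let the whole package die. A minimal, generic sketch of that idea (the `flaky_download` operation and all names here are hypothetical; a real package would also reconnect and resume from a checkpoint rather than restart blindly):

```python
import time

def with_retries(operation, attempts=3, backoff_seconds=1.0):
    """Re-run an operation that may fail mid-way because a remote
    connection dropped, waiting a little longer after each failure."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts:
                raise  # out of attempts: surface the error to the caller
            time.sleep(backoff_seconds * attempt)  # simple linear backoff

# Demo: a hypothetical operation that fails twice, then succeeds.
calls = {"n": 0}
def flaky_download():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("remote host dropped the connection")
    return "downloaded"

result = with_retries(flaky_download, attempts=3, backoff_seconds=0)
```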
Also, you can try doing batched inserts into your archive table by setting "Rows per batch" to a reasonable number. That may help if your inserts are very large.
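The effect of a "Rows per batch" setting is to pay one round trip and one commit per chunk of rows instead of per row. A rough illustration of that chunking idea in Python with sqlite3 (the `ArchiveOrders` table and batch size are invented for the demo):

```python
import sqlite3

def batched(rows, batch_size):
    """Yield rows in chunks of batch_size (the 'Rows per batch' idea)."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch, if any

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ArchiveOrders(OrderNumber INTEGER)")  # hypothetical table

rows = [(n,) for n in range(10_000)]
batches = 0
for batch in batched(rows, 1_000):
    # One INSERT statement and one commit per batch, instead of per row.
    conn.executemany("INSERT INTO ArchiveOrders VALUES (?)", batch)
    conn.commit()
    batches += 1
```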

But first, check your logs as Jamie said.
