
Friday, March 23, 2012

Package Update and Build Process

My SSIS solution has about a hundred packages, and from time to time I have to edit one of them. I understand I could use the 'Build' command to compile only the updated package, as opposed to 'Rebuild', which recompiles all of the packages.

Nevertheless, in both cases SSIS opens all of the packages in the design environment before compilation. My packages are stored in SourceSafe, so that process takes quite a long time. Is there any way to compile only the updated package so that none of the other packages are opened during the Build/Rebuild process? For example, we can use dtutil to deploy only updated packages without running the Package Installation Wizard.

Turn off the "Build deployment Utility" option, as described in http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=874332&SiteID=1. With this option disabled, the packages will no longer all be opened every time you debug a single package via F5 or select the Build Project or Build Solution menu items.

For that matter, turn off the Integration Services project's "Build" option in Visual Studio's Configuration Manager. SSIS in BIDS doesn't compile or build anything; rather, it copies your hundred .dtsx files to the project-relative "bin\" subdirectory. It's doubtful you need four copies of each of the hundred packages: one in source control and three in your local workspace, two of which are superfluous (the copies in bin\ and bin\Deployment).

As you mentioned, use dtutil, or even xcopy (if appropriate), for deployment rather than the Package Installation Wizard. For example, see http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1828408&SiteID=1, wherein dtutil is used for SQL Server deployment.
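
For illustration, a single updated package can be pushed to the msdb package store with one dtutil call (server, path, and package names here are placeholders):

rem Copy one .dtsx file into the msdb package store on the target server
dtutil /FILE "C:\Projects\MySolution\UpdatedPackage.dtsx" /COPY SQL;UpdatedPackage /DestServer MYSERVER /QUIET

Repeat that one line per changed package and nothing else gets touched: no wizard, and no opening of the other ninety-nine packages.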

|||

Thanks very much; your suggested approach will save me the painful waiting time I had to endure before.

Asaf


Wednesday, March 21, 2012

Package Execution takes 100% CPU

Hi all,

I have a package that updates a table containing a large number of rows. When this package is running, the process consumes 100% CPU and other services crash (SSAS, SPS, SSRS).

Is there any parameter that can be set to prevent SSIS from consuming 100% of the CPU during the execution of this package?

Try redesigning your SSIS package...

Regards!

|||Is SSIS taking up the processor, or is it the relational engine? I assume it is the latter.

First off, if SQL Server is sharing the box with other processes, you need to limit the max amount of memory it can take. Otherwise, it is likely to hog it all and starve the other processes.
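
For example, the instance-wide cap can be set with sp_configure; the 4096 MB value below is only an illustration and should be sized for your server:

-- 'max server memory (MB)' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the buffer pool so the other services keep some memory
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;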

Second, if you have a lot of rows to update, you should bulk insert the changes into a staging table and then run a single UPDATE statement that joins the staging and target tables, as sketched below. Using the OLE DB Command, SSIS can only update one row at a time, which is very inefficient.
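
A minimal sketch of that set-based pattern (table, key, and column names here are hypothetical): bulk load the changed rows into a staging table, then apply them in one statement:

-- Apply all staged changes in a single set-based UPDATE
-- (dbo.TargetTable, dbo.StagingTable, and the columns are placeholder names)
UPDATE t
SET    t.Amount       = s.Amount,
       t.ModifiedDate = s.ModifiedDate
FROM   dbo.TargetTable  AS t
JOIN   dbo.StagingTable AS s
       ON t.BusinessKey = s.BusinessKey;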

Finally, for a large table you need to be very careful about your indexes. A table with many indexes will be much slower to update than one without. Ideally, you want only one index on this table: the one that allows you to find the rows to update. Obviously, you'll need to balance this against the intended use of the table.
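
One related pattern, offered here as an aside rather than something from the thread: on SQL Server 2005 and later you can disable nonclustered indexes before a very large update and rebuild them afterwards, instead of dropping and recreating them. Index and table names below are placeholders:

-- Disable a nonclustered index before the bulk update
-- (never disable the clustered index; that takes the table offline)
ALTER INDEX IX_TargetTable_SomeColumn ON dbo.TargetTable DISABLE;

-- ... run the large UPDATE here ...

-- Rebuilding the index re-enables it
ALTER INDEX IX_TargetTable_SomeColumn ON dbo.TargetTable REBUILD;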

|||Thanks for your answer.

Monday, March 12, 2012

Package cancelled

Hi,
I'm trying to transfer data from a DB2 database to a MS SQL Server database (approx. 150 000 rows).

When I execute the package, the process is always cancelled (In output window: SSIS package "DB2_test_TranAddl.dtsx" finished: Canceled.)

I use a view on DB2 to provide the data and I can modify it to return only a few rows. To achieve this, I use a WHERE clause: WHERE OUTLET_NO Between X And Y.

When I try OUTLET_NO Between 1 And 5 (255 rows) the package executes successfully.
When I try OUTLET_NO Between 5 And 10 (388 rows) it works.

When I try OUTLET_NO Between 1 And 10 (643 rows) it fails!!
Any idea?

How fast does each of the queries run (1-5 and 5-10)? Could the connector be timing out?|||

Phil Brammer wrote:

How fast does each of the queries run (1-5 and 5-10)? Could the connector be timing out?

A few seconds (less than 5)

The timeouts are set to 0.

I tried changing them to 60 seconds but it doesn't work...

|||

OK here's an update:

I was using the IBM driver, it is not functionnal.

I installed the Microsoft OLE DB Provider for DB2 and (after spending about half an hour trying to configure the connection) it works: 150,000 rows transferred successfully in a little more than 8 minutes. The DTS version takes about 1 min 30 sec, so I guess I have to do some tweaking...
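
For anyone else fighting that configuration step, a connection string for this provider looks roughly like the one below; every value is a placeholder, and the exact keyword set varies by provider version, so treat it as an assumption to verify rather than a reference:

Provider=DB2OLEDB;Network Transport Library=TCPIP;Network Address=192.168.0.10;Network Port=50000;Initial Catalog=MYDB;Package Collection=MYPKGCOL;Default Schema=MYSCHEMA;User ID=MYUSER;Password=********;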

EDIT: With FastLoad enabled on the destination, the transfer now takes approximately 1 min 30 sec.

|||

Fleo,

I have the same performance issue you mentioned: 300,000 rows take only 3 minutes with the ODBC connection in DTS, but more than 5 minutes with the Microsoft OLE DB Provider for DB2 in SSIS. Have you figured out any tweaking that might help speed up the transfer?

Thanks.

|||

Any reason you can't use an ODBC connection in SSIS?

-Jamie

|||I need to load data from SQL2005 to AS400 and there is no ADO.NET/ODBC destination. Also, the unicode issue.|||

Jane2006 wrote:

I need to load data from SQL2005 to AS400 and there is no ADO.NET/ODBC destination. Also, the unicode issue.

Right, well, let's keep your issue to the thread you started: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1723508&SiteID=1

You can use the OLE DB Destination to write to AS400.

Monday, February 20, 2012

overflow of the data type while processing dimension

While trying to process a cube, the error below came up while processing one of the dimensions.

The database is an Oracle database running on a UNIX platform, though that piece of information may not be necessary to solve this riddle.

I believe it has to do with date format differences between the two database platforms. Has anyone out there been able to resolve such a problem, or can anyone point me in the right direction?

See below for an excerpt of the log.

Start time: 30-Mar-07 13:04:47; End time: 30-Mar-07 13:05:02; Duration: 0:00:15
SQL queries 1
SELECT
DISTINCT
"T_0"."DEALER_CODE" "DIMM_SUBSCRIBER_DEALER_XREF0_0"
FROM "DLRPROFILE"."DIMM_SUBSCRIBER_DEALER_XREF" "T_0"
Processing Dimension Attribute 'ENTRY DATE' failed. 1 rows have been read.
Start time: 30-Mar-07 13:04:47; End time: 30-Mar-07 13:06:06; Duration: 0:01:19
SQL queries 1
SELECT
DISTINCT
"T_0"."ENTRY_DATE" "DIMM_SUBSCRIBER_DEALER_XREF0_0"
FROM "DLRPROFILE"."DIMM_SUBSCRIBER_DEALER_XREF" "T_0"
Error Messages 1
Errors in the back-end database access module. OLE DB reported an overflow of the data type for column 0. Errors in the OLAP storage engine: An error occurred while the 'ENTRY DATE' attribute of the 'DIMM SUBSCRIBER DEALER XREF' dimension from the 'DEALER_SALES_CONNECTION_ANALYSIS' database was being processed.
Processing Dimension Attribute 'REGION CODE' completed successfully. 10 rows have been read.
Start time: 30-Mar-07 13:04:47; End time: 30-Mar-07 13:06:06; Duration: 0:01:19

One more question, please: is there any literature addressing SSAS and Oracle databases working in tandem?

I need Help.......

What is the data type of the column behind the 'ENTRY DATE' attribute in Oracle, compared with the data type configured for the attribute in the dimension? It sounds like the attribute is using a smaller data type.
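
As a first diagnostic (using the schema and table names from the log above), it may be worth checking the actual range of ENTRY_DATE on the Oracle side; a single stray date far in the past or future can overflow a smaller date type during processing:

SELECT MIN("ENTRY_DATE") AS min_entry_date,
       MAX("ENTRY_DATE") AS max_entry_date
FROM   "DLRPROFILE"."DIMM_SUBSCRIBER_DEALER_XREF";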

Overcoming orphans

How do you restore master to a new database and not orphan users in the
process?
--
Message posted via http://www.sqlmonster.com

|||

Look up sp_change_users_login in BOL.
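
A minimal sketch of the usual sequence (the user name below is a placeholder):

-- List database users whose SIDs no longer match a server login
EXEC sp_change_users_login 'Report';

-- Remap an orphaned user to the SQL login of the same name
EXEC sp_change_users_login 'Auto_Fix', 'SomeUser';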
--
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"Robert R via SQLMonster.com" <u3288@.uwe> wrote in message
news:5d589125f94ca@.uwe...
> How do you restore master to a new database and not orphan users in the
> process?
> --
> Message posted via http://www.sqlmonster.com