Showing posts with label configurations. Show all posts

Friday, March 23, 2012

Package setup, configurations and testing

Hi,

I'm looking for advice on how to organize my SSIS packages. I'll quickly describe my requirements, and then outline how I see the solution.

1. Should work for development in BI Studio. Design time experience should be good, so static schema information should be available in all steps. Packages should run in debug mode inside BI Studio.

2. Should be able to run automated tests (which I gather is difficult. See proposed solution below). Tests should run both on a test machine, in batch mode, and from BI Studio.

3. Should work for deployment.

The packages should be fine-grained. Automated tests will be implemented as SSIS packages that call the packages to be tested via an Execute Package Task. There will be one test package per testcase. A test will look like: [Create Schema] -> [Load test data]->[Execute Package to be tested]->[Verify expected results]->[Teardown].

There will be one master test package that executes all the tests.

For this I need a) some way to vary the connections for the packages depending on whether they are run from a testcase or from the normal control flow. I'm considering making the connection strings configurations based on a parent package variable. Other configuration types strike me as not suitable since developers will want to run packages both as part of normal flow and as testcases on their machines. Is this a viable strategy? Any drawbacks?

b) I need some way to organize test cases. I tried putting tests in a subfolder of the project to minimize the clutter, but BI Studio shows them at the root node. I tried putting a new project (for containing the tests) in the same folder and adding the packages (via Add Existing Item), but BI Studio insists on adding copies (!) of the package files, rather than reusing the existing files. Any suggestions?

I welcome any feedback, suggestions of alternate strategies, etc.

/Anders

AndersI wrote:

For this I need a) some way to vary the connections for the packages depending on whether they are run from a testcase or from the normal control flow. I'm considering making the connection strings configurations based on a parent package variable. Other configuration types strike me as not suitable since developers will want to run packages both as part of normal flow and as testcases on their machines. Is this a viable strategy? Any drawbacks?

One possible drawback is that when the developers run in BIDS (not the test case), the packages will use the connection strings coded in the packages. I like using configurations for my packages even in development.

Another approach might be to use another type of configuration for the packages in dev, but when you run them from the test packages, use Execute Process instead of Execute Package. You can call dtexec from Execute Process and override the connection strings by using the /CONN switch, which should take precedence over the configurations.
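As a sketch of that approach (the package path, connection manager name "TargetDB", and connection string are all hypothetical placeholders), the Execute Process Task would build a dtexec command line like this; the snippet only assembles and echoes the command:

```shell
# Sketch only: the dtexec command line an Execute Process Task might run.
# The package path, connection manager name ("TargetDB"), and connection
# string below are hypothetical placeholders.
PKG='C:\ETL\ChildPackage.dtsx'
TEST_CS='Data Source=TESTSRV;Initial Catalog=StageDB;Integrated Security=SSPI;'
CMD="dtexec /FILE \"$PKG\" /CONN \"TargetDB\";\"$TEST_CS\""
echo "$CMD"
```

As noted above, /CONN should take precedence over values coming from configurations, which is what makes this usable from a test harness.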

AndersI wrote:

b) need to organize testcases some way. I tried putting tests in a subfolder of the project to minimize the clutter, but BI studio shows them at the root node. I tried putting a new project (for containing the tests) in the same folder, and adding the packages (via add existing item), but BI Studio insists of adding copies (!) of the package files, rather than reusing the existing files. Any suggestions?

Can't you just put the test case packages in one project, and leave the real ones in another project?

|||

Thanks!

I'm using configurations for the master packages to set the connection variables, and then passing them down to child packages via parent package variable configurations.

The reason I want the packages-to-be-tested in the same project as the testcase packages is so that you can run them in the BI studio debugger. If the package called via Execute Package is not in the same project, then the debugger won't display it. [Edit] Scratch that, it works. Must have been a user error on my part... [/Edit]

The Execute Package task uses a connection with an absolute path to refer to the called package, right? What is the recommended practice for handling this in a development team where developers may have the files checked out to different locations on their machines? I'm considering adding a "PackagePath" variable (also stored in the configuration file) to the master packages, and using expressions of the form @[User::PackagePath] + "ChildPackage.dtsx" for the ConnectionString on the file connection manager. Has anyone else tried this?

/Anders

|||

AndersI wrote:

The Execute Package task uses a connection with an absolute path to refer to the called package, right? What is the recommended practice for handling this in a development team where developers may have the files checked out to different locations on their machines? I'm considering adding a "PackagePath" variable (also stored in the configuration file) to the master packages, and using expressions of the form @[User::PackagePath] + "ChildPackage.dtsx" for the connectionstring on the file connection manager. Has anyone else tried this?

Yes, I've done exactly that. It works well.
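For illustration, the per-developer configuration file for that PackagePath variable could look like the following. The variable name comes from the post above; the file name and paths are made-up examples, and the shell snippet just writes the XML out:

```shell
# Sketch: write a per-developer dtsConfig that sets the PackagePath
# variable discussed above. Each developer points it at their own
# checkout location; the file name and path are hypothetical.
PACKAGE_PATH='C:\Dev\MyCheckout\Packages\'
cat > PackagePath.dtsConfig <<EOF
<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
      Path="\Package.Variables[User::PackagePath].Properties[Value]"
      ValueType="String">
    <ConfiguredValue>$PACKAGE_PATH</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
EOF
cat PackagePath.dtsConfig
```

Each developer keeps their own copy of this file outside source control, so the same packages resolve child-package paths correctly on every machine.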

Tuesday, March 20, 2012

Package configurations stored in the database

If the configurations are stored in the database, does the package pick up the configuration as and when required, or is it a one-time pick when the package is executed?

What I plan to do:

When the package is executed, I want to have a script task at the start of the package that will update the configuration table in the database with the values that the package needs to run with. I mean, change the values of the configurations of the tasks which will be executed after the script task.

Thanks for your time.

$wapnil

They are a one-time pickup at the beginning of execution. Look at the "Execution Results" of one of your packages.|||

How to look at the execution results?

Thanks,

$wapnil

|||

spattewar wrote:

What I plan to do?

When the package is executed, I want to have a script task at the start of the package that will update the configuration table in the database with the values that the package needs to run with. I mean change the values of the configurations of the tasks which will be executed after the script task.

You will not be able to change the table content within the same package that will use it, because of what Phil just said.

But why would you want to change it that way? Actually, where will that script get the values from?

Have you looked at the dtexec /SET option to assign values to the package properties instead?
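For example (the package path, variable name, and value here are hypothetical), dtexec can push a value straight into a package variable at execution time, with no script task rewriting the configuration table; the snippet only assembles and echoes the command:

```shell
# Sketch: override a package variable with dtexec /SET at run time,
# instead of rewriting the configuration table from a script task.
# The package path and variable name are hypothetical.
PKG='C:\ETL\FtpLoad.dtsx'
SET_OPT='\Package.Variables[User::FtpServer].Properties[Value];ftp-test.example.com'
CMD="dtexec /FILE \"$PKG\" /SET $SET_OPT"
echo "$CMD"
```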

|||

spattewar wrote:

How to look at the execution results?

Thanks,

$wapnil

It's a tab in Visual Studio.|||

I think I got it all wrong. Let me try to explain.

I have developed a single package and stored the configuration details in the database, in the table SSIS Configurations. This package connects to an FTP server, downloads files, and uploads them into a destination table in the database. Now I want this package

1) to be able to connect to different FTP servers in a sequence one after the other. The FTP connection details are stored in one more table FTP_Details in the database.

2) OR be able to run multiple instance of this package at the same time, each connecting to different FTP servers and downloading files.

Could you kindly provide your inputs.

Thanks for your responses.

$wapnil

|||

A simple way may be to have a master package with a ForEach loop container and an Execute package task inside. The ForEach loop would iterate through the rows in FTP_Details table to get FTP connection details, and place it/them into a variable(s). Then the Child package will use Parent variable Package configuration to receive the proper connection details on each iteration.

Notice that this approach will not execute the package in parallel.
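The control flow described above, sketched in shell terms: one child-package execution per FTP_Details row, strictly in sequence. The rows, columns, and child package name are invented, and dtexec is only echoed here, not run:

```shell
# Sketch: what the master package's ForEach loop does, one child package
# execution per FTP_Details row, serially. Rows, columns, and the child
# package name are hypothetical; dtexec is only echoed.
CMDS=$(
  while IFS='|' read -r FTP_HOST FTP_USER; do
    echo "dtexec /FILE ChildFtpLoad.dtsx" \
         "/SET \Package.Variables[User::FtpHost].Properties[Value];$FTP_HOST"
  done <<'EOF'
ftp1.example.com|alice
ftp2.example.com|bob
EOF
)
echo "$CMDS"
```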

|||Rafael's got it covered, but in case you need more information (again, the forum search is your friend), here's a link on this very topic:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=729225&SiteID=1|||

Thanks Phil and Rafael. I got the approach on how to do it now.

Just one more question:

If I want to run multiple instances of this package at the same time, then what would be the best approach?

again, thanks for your time.

$wapnil

|||You can't run the same package multiple times at once UNLESS you create a master package that calls the same package under an Execute Package task multiple times. Even then though, I'm not sure that's a good idea given that the packages will all share the same GUID (because they are the same!).

I don't understand why you'd want them to run at the same time, when kicking one package off in a foreach loop would be best. If you must run the same package multiple times then perhaps you'd be better off copying the package (physically) and then changing its GUID.

My only concern is how can you run the same package concurrently with another instance of itself without having one step on the other's toes?|||

I would have used a ForEach loop if it sufficed for our requirement.

One of the requirements could be that we want to download files from 4 different FTP servers at the same time and process the files. Does this call for creating four different packages, one for each FTP server?

Can I not use the dtexec utility and run the package multiple times by passing different arguments?

Thanks for your time.

$wapnil

|||

spattewar wrote:

I would have used a ForEach loop if it sufficed for our requirement.

One of the requirements could be that we want to download files from 4 different FTP servers at the same time and process the files. Does this call for creating four different packages, one for each FTP server?

Can I not use the dtexec utility and run the package multiple times by passing different arguments?

Thanks for your time.

$wapnil

Yes, calling dtexec would work just fine. If you want to make your FTP site list database driven, then no, it won't help. UNLESS, you passed in a key to dtexec or something, and that key matches up to a row in the database.

What we've been talking about here is how you can create one package which will loop through a table to set up the FTP connection for as many sites as are listed in the table.

There are numerous ways to do this, of course.|||

Ok understand that.

Then maybe I can save the package configuration into an XML file and provide that XML file as an input to dtexec. This way I can run the same package by passing different configuration files.

What do you say?

Thanks for your time.

$wapnil

|||

spattewar wrote:

Ok understand that.

Then maybe I can save the package configuration into an XML file and provide that XML file as an input to dtexec. This way I can run the same package by passing different configuration files.

What do you say?

Thanks for your time.

$wapnil

Sounds okay to me, if that's what you want to do... I still question the need to run them at the same time.... If you're downloading gigs of data, I'm not sure that you'll get any faster results by running them concurrently versus serialized. The data pipe can only move so much data. So, with that said, you may want to look to build a more modular approach by using a database table to drive your connections.
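If the per-instance XML config route is taken anyway, launching the instances concurrently could look roughly like this. The file and package names are placeholders, and the dtexec invocations are only echoed, not actually run:

```shell
# Sketch: run the same package several times in parallel, each instance
# with its own configuration file. File names are placeholders and the
# dtexec invocations are only echoed.
OUT=$(
  for CFG in ftp_site_a.dtsConfig ftp_site_b.dtsConfig; do
    echo "dtexec /FILE FtpLoad.dtsx /ConfigFile $CFG" &
  done
  wait
)
echo "$OUT"
```

As Phil cautions above, whether parallel instances actually finish faster than a serial loop depends on the shared bandwidth, and concurrent instances must not step on each other's working files or destination rows.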

However you decide, I think you've got enough information to make some progress.|||Of course I have. Thanks.


Package Configurations - Using One config file for Multiple Packages

Hi All,

I am working on a project currently where we have many SSIS packages and we want to minimize the number of config files to be used. What I was thinking was using one master config file which will have Server/Database info which will be used by all the packages. Now the thing is there are multiple packages which are being executed from within a master package and I was wondering if this will be an issue. Is there anything else that I will have to keep in mind? Any help is appreciated.

Thanks

If my memory serves correctly then you will get lots of warnings about references in your config file to properties that don't exist in your package. But it should still execute OK.

-Jamie

|||Thanks Jamie

Package Configurations - some requests

To any Microsoft SSIS developers reading this forum, some requests:

1) do NOT allow a package to run if it has "package configurations" enabled and one or more XML files with configurations cannot be found;

2) update the properties window with the properties read from the "package configurations";

3) allow relative paths in the "package configurations" locations (relative to the package location)

Thank you :-)

Tom,

If you want to make requests for future enhancements then do so at Microsoft Connect.

Regarding point #1, quite a few people have asked for this, but then a lot of people during the beta timeframe asked for the behaviour to be the way it is now. You just can't please all of the people all of the time.

-Jamie

|||

Hi Jamie,

Thanks for the link - will use (abuse) it.

Cheers,
Tom

Package Configurations

Hi,

I have a package configuration for my SSIS packages that allows me to dynamically change the connection string of the connections in my packages. This was done so that when I deploy my packages to my development or test server, the packages use the development or test server name in the connection string. I have set the configuration up to use an environment variable to store the location of the config file (an XML config file). The package, however, does not seem to be using the environment variable: if I change the location of the config file in the variable's value, it still points to the old location. Can someone please help?

Thanks

With the above in mind, my end goal would be to have the same packages deployed to my test and live environments at the same time, but when I execute the packages in the test environment they will use the connection string for the test environment, and likewise when I execute the packages in the live environment they must use the connection string for the live environment.

So, effectively, I am going to have a configuration file containing the connection string on both the live and test servers. The packages therefore need to point to the respective files. Has anyone done something similar?
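A sketch of the indirect-configuration setup being described (the variable name and paths are assumptions, not from the posts): the same package is deployed to both servers, and a machine-level environment variable tells it where its local config file lives.

```shell
# Sketch: an "indirect" configuration. The package stores only the name
# of an environment variable; each machine sets that variable to its own
# local config file. Variable name and paths are hypothetical.
# On the test server:
SSIS_CONFIG='C:\SSISConfig\test\connections.dtsConfig'
# On the live server the same variable would instead hold, e.g.:
# C:\SSISConfig\live\connections.dtsConfig
echo "Config file resolved from environment: $SSIS_CONFIG"
```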

|||

Not sure if that is the reason for your issue, but BIDS will pick up changes to the environment variable only after you close it and open it back up.

The approach that you described in your second post is pretty common when you have 2 environments on separate machines. I don't see an issue with that.

Rafael Salas

|||Okay, sorted that out. Another issue that I have identified, however, is that the connection string that is used seems to depend on who you are logged in to SSIS as (serious problem). For example, let's say that on my test server the configuration file specifies 'Test' as the server name to use in the connection string. When I go to the physical server, get onto SSIS, and check what the connection string is, it is correctly using 'Test' as the server name. But if I connect to SSIS on the test server from my machine and check what server name is being used in the connection string, it shows the original server name that the package was deployed with (this is different from the server name specified in the config file on the test server). Do you know if this is a bug in SQL 2005, or am I missing something here?|||

Not sure if I am understanding this correctly. Anyway, when you use package configurations to set values at run time, they exist just at that time (at run time); whatever values were stored in the package the last time it was saved will always be there.

Rafael Salas

package configurations

Hi,

I keep struggling with dtsConfig.

I have 4 packages which all have the same data source, based on the global data source.

Is there a way I can save that global data source, instead of configuring all 4 packages to use a config file?

Mafti,

What is your issue?

Setting up package configurations should be a one-time task during development; after that, you have to change the config file only when a change in the connection string occurs (server name, credentials, etc).

The other way would be to save the connection credentials (some sensitive data involved) within the package definition; but for that, SSIS requires the sensitive data to be encrypted, creating other potential deployment issues. If you want to follow that path, look in BOL or this forum for a property at the package level called ProtectionLevel.

|||

I think I have the same question.

I want to set my connection strings from a config file.

Is there some way to configure the Global Datasource to use an expression to set the connection string? It seems the only way to set it from a config file is to add a connection to the connection managers from the Global Datasource, then set the expression for the connection in the package itself. Thereby making the use of the Global Datasource useless in this instance, other than not having to retype the name.

|||Make it simple:

We have 1 global .dtsConfig file which stores the db connection string (all packages have the variable DB_CONN_STR), and for each DB provider you use, an expression sets the connection string.

Also, for every package we have a separate config file to set specific info like FTP user/pass etc.|||

That is what I am doing.

The problem is that there is no way to set up a global data source to use a variable, or so it appears.

That forces you to either:

1) create a new connection each time and configure it to set the connection string from a variable (expression), or

2) use a global data source and then modify the actual connection created from it to set the connection string from a variable (expression).

|||

Karstman wrote:

that is what i am doing.

the problem is that there is no way to set up a global datasource to use a variable, or so it appears.

That forces you to either:

1) create a new connection each time and configure it to set the connection string from a variable (expression)

2) use a global datasource then modify the actual connection created from it to set the connection string from a variable (expression).

No matter if you use a 'global data source', the data source information is copied into each package that uses it. The solution for me is not to use those data sources at the project level, as I don't see any added value in using them.


Package Configuration wizard:-SQL Configurations Configuration filter not working

Hi -- I was wondering if this is a bug: when I add new data in my table SSIS Configurations and give the wizard a new configuration filter, the package configuration wizard cannot see the new values -- the old values from the previous configuration are still showing. Is there any known workaround or forced refresh I can do?

thanks in advance Dave

Background:

SQL Package Configurations are most important because they provide the possibility of a central configuration store for your entire enterprise, and are in my mind the only way to go.

http://sqljunkies.com/WebLog/knight_reign/archive/2005/01/24/6843.aspx

Wizard results:

Name:
ETL

Type:
SQL Server

Connection name:
ETLConfiguration

Any existing configuration information for selected configuration filter will be overwritten with new configuration settings.

Configuration table name:
[dbo].[SSIS Configurations]

Configuration filter:
PT_CUST_ABR

Target Property:
\Package.Variables[User::gsPreLoad].Properties[Value]
\Package.Variables[User::gsPostLoad].Properties[Value]
\Package.Variables[User::gsLoad].Properties[Value]
\Package.Variables[User::gsFlatFilename].Properties[Value]
\Package.Variables[User::gsFileName].Properties[Value]
\Package.Variables[User::gsCDOMailTo].Properties[Value]
\Package.Variables[User::gsCDOMailSubject].Properties[Value]
\Package.Variables[User::giRecordCount].Properties[Value]
\Package.Variables[User::giFileSize].Properties[Value]
\Package.Variables[User::giBatchID].Properties[Value]
\Package.Variables[User::gdFileDateCreated].Properties[Value]
\Package.Connections[MyDatabase].Properties[ServerName]
\Package.Connections[MyDatabase].Properties[InitialCatalog]

USE [ETLConfiguration]
GO
/****** Object: Table [dbo].[SSIS Configurations] Script Date: 05/23/2006 13:34:35 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[SSIS Configurations](
[ConfigurationFilter] [nvarchar](255) COLLATE Latin1_General_CI_AS NOT NULL,
[ConfiguredValue] [nvarchar](255) COLLATE Latin1_General_CI_AS NULL,
[PackagePath] [nvarchar](255) COLLATE Latin1_General_CI_AS NOT NULL,
[ConfiguredValueType] [nvarchar](20) COLLATE Latin1_General_CI_AS NOT NULL
) ON [PRIMARY]
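Given that table layout, adding a new filter's rows by script rather than through the wizard might look like this. The filter name and property paths are modeled on the wizard output above; the server and catalog values are placeholders, and the snippet only generates the script:

```shell
# Sketch: generate an INSERT script for a new configuration filter
# instead of typing each row into the wizard. Values are placeholders
# modeled on the property paths listed above.
FILTER='PT_CUST_ABR'
cat > new_filter.sql <<EOF
INSERT INTO [dbo].[SSIS Configurations]
  (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
  (N'$FILTER', N'MYSERVER',
   N'\Package.Connections[MyDatabase].Properties[ServerName]', N'String'),
  (N'$FILTER', N'ETL',
   N'\Package.Connections[MyDatabase].Properties[InitialCatalog]', N'String');
EOF
cat new_filter.sql
```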

I have just tried this scenario and cannot reproduce it. I added several SQL configurations and all worked fine.

I wonder if you could try to repro, while running SQL Profiler - that way you may discover if there are any issues in updating your config table.

Donald

|||

Thanks Donald -- first off, I have discovered a workaround. I think it's way too risky and error prone to let the wizard write the actual values into the configuration table -- you can imagine the chaos in a large data warehouse that has lots of new records to enter into the table! Let's think it through: if I have, say, 5 new staging tables to add to the warehouse, that's 65 manual entries I would have to make with the wizard -- no way is this workable.

Instead I prefer to SQL-script my new entries and load them in one hit.

The workaround is to run the SQL insert script, then go to the wizard, change the configuration filter, and go through to the end. At this stage, if you look at the variables, they won't have changed since your last entry to the configuration table.

Step 1: Save the package

Step 2: Exit out of Visual Studio completely, reopen it, and go back into the package

Hey presto, the new values in the variables have changed.

Can confirm this workaround only for SQL Server 2005

I have not tested for SQL Server 2005 SP1.

cheers Dave

Monday, March 12, 2012

Package Configuration for Oracle Connections

Hi there,

I created some configurations for my package as Jamie described in http://blogs.conchango.com/jamiethomson/archive/2005/11/02/2342.aspx
for my Connections. I use ADO Connections as well as OLEDB Connections for Oracle (MSDAORA).

Everything works very fine, until I store passwords in the file. Yes, I know I shouldn't, but the main idea was to keep the machine-dependent data in an external configuration file, so you don't need to re-build your package if you want to put it onto another machine. This scenario is only possible if the passwords are the same for the two machines, but in my case the passwords are different (even the users, which isn't the prob).

The point is: Storing passwords in a configuration file works for OLEDB Connections to
an SQL provider but NOT to the Oracle provider mentioned above.

Any suggestion would be appreciated
Fridtjof

Hi,
If you have 2 different machines then that means 2 different connection strings which means 2 different configurations. You can't use the same configuration for both.

[As an aside, you can store multiple configurations in a single XML configuration file.]
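For illustration, a file holding two configurations might look like this sketch; the connection names, package paths, and credential values are placeholders, not from this thread:

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Two configurations in one .dtsconfig: a SQL Server connection and
       an Oracle connection. All names and values here are illustrative. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[SqlConn].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=SQLHOST;Initial Catalog=MyDb;Provider=SQLNCLI;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[OraConn].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=ORA.WORLD;User ID=someuser;Password=secret;Provider=MSDAORA.1;Persist Security Info=True;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```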

-Jamie|||Hi Jamie,

you're right that you have to use two different configuration files.

In fact I do have two different configuration files, but it seems that the password for the
Oracle connection is not read when the package is opened with either of those configs.

In contrast, the password I have stored for the SQL connection is being read properly.

So... :(
|||Friedel,
I am able to change the password in the config file for an ADO Oracle connection. I have not used the OLEDB connection for Oracle (MSDAORA), however. I don't normally do this in my configs, so I tested it through VS, not in a server environment. Does it work for you when testing through VS?

Adrian
|||Hi Adrian,

thanks that helped. I tried it with an ADO Connection. That worked!

By the way: Weren't there some posts about a performance issue concerning ADO.ORA vs. OLEDB.ORA Connections?

Greetings
Fridtjof
|||Friedel,
Yeah I had posted a question about performance a few months back...
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=87417&SiteID=1

Since then I've had the best performance with ADO for Oracle; however, I don't have any concrete tests to cite for this, just an observation. Plus the added benefit of using expressions on the DataReader source in a dataflow (which I believe can only use ADO) gave me some dynamic flexibility as well.
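For reference, the "expressions on the DataReader source" trick works by putting a property expression on the parent Data Flow task rather than on the component itself; a rough sketch, with the component name, column names, and variable purely illustrative:

```
On the Data Flow task's Expressions collection, target the property
  [DataReader Source].[SqlCommand]
and supply an SSIS expression that builds the query at runtime, e.g.:
  "SELECT OrderId, OrderTotal FROM Orders WHERE Region = '" + @[User::Region] + "'"
```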

Who knows, maybe some of the MSOFT guys have some concrete info on Oracle performance now with different providers?

Adrian
|||

Friedel wrote:

Hi Jamie,

you're right that you have to use two different configuration files.

In fact I do have two different configuration files, but it seems that the password for the
Oracle connection is not read when the package is opened with either of those configs.

In contrast, the password I have stored for the SQL connection is being read properly.

So... :(

I didn't say that you have to use 2 different configuration files - I said you have to use 2 different configurations. They can both go in the same .dtsconfig file, as long as every package that uses the file contains all the objects referenced by the configurations within it.

I am using an OLE DB Connection Manager against Oracle and it works fine with the password stored in the config file. Here's the connection string I'm storing:

Data Source=ssafeukp.WORLD;User ID=*****;Password=*****;Provider=MSDAORA.1;Persist Security Info=True;

-Jamie

Package config errors - VS_NEEDSNEWMETADATA

Hi everyone,
I am having issues with package configurations when changing just the ServerName and InitialCatalog variables of a Connection Manager. I have DelayValidation = True on the Data Flow Task that is erroring out, but I am still receiving a VS_NEEDSNEWMETADATA error before the package executes. The two boxes have identical tables I'm trying to access, although one is 2005 and the other is 2000. Any thoughts? I thought package configs were supposed to be quick and painless?
Thanks,
Adrian
NeedsNewMetadata means that the component has detected a difference between the table you initialized it with and the one you are running against, which is not surprising if you are changing SQL Server versions. Configurations are supposed to be quick and painless if used in a quick and painless way. :) For example, you wouldn't expect them to work if you switched the database between SQL Server and DB/2 (at least I hope not :) ). Well, 2000 and 2005 are quite different, and this difference is manifesting itself in the component determining it needs a metadata refresh.

HTH,
Matt|||I would have that impression if using a non-Microsoft DB product. However, since the datatypes were the same and the connection manager type (OLEDB) could connect to both, I assumed there would be no problems. But we know what they say about someone who assumes, huh? If that is to be expected, then I suppose there is nothing that can be done.
Thanks,
Adrian
|||Can you verify that the column names are identical, as well as their types (and sizes)? In general we don't cause this type of failure based on the database back end; rather, the component is determining that there is some difference between the metadata it has for the table and the metadata that the table it is being attached to is presenting to it. All I meant was that when switching to a different database it is easier for the mismatch to occur. I am not certain that was clear when rereading my post.

Thanks,
Matt|||I did verify the datatypes and sizes. I accepted 2005's table defaults in all other areas, so maybe something has changed there that I need to check. I will post back if those are identical and there are still problems.
Thanks,
Adrian
|||The column names are also very important, as that is how we determine a metadata match. So if the names change, you will get this problem 100% of the time, even if the types and sizes are identical.

Thanks,
Matt|||Are the column name matches case-sensitive? If so, there's my problem.
|||Yes, the check is case-sensitive. This is so we don't miss catching a bad case when a database is case-sensitive for column names.

Thanks,
Matt|||

Although I found this thread at a much later date, it helped me a lot. I was facing the VS_NEEDSNEWMETADATA error, and the reason was case-sensitive column names.

Thanks a lot

|||

We worked around this by setting the source to a SQL command rather than a Table/View. Do that in the Source Editor's "Data Access Mode" box, then type in your query, e.g. SELECT Column1 FROM Table1. No matter the case of Column1 on the back end, this succeeded in our case.

This workaround keeps validation from being so strict on the column names.

Since our back end was case-insensitive SQL Server, it can handle the mixed cases of column names, so the SQL command succeeds against the data source irrespective of column case, and SSIS is not aware that the back-end column metadata doesn't match the case of the SSIS package's column labels.
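Building on that workaround, if the package metadata and the back-end column names differ only in case, you can also alias the columns in the SQL command so the source's output metadata matches what the package expects; the table and column names below are purely illustrative:

```sql
-- Hypothetical example: the table exposes CUSTOMERID / ORDERTOTAL,
-- while the SSIS package was built expecting CustomerId / OrderTotal.
-- Aliasing makes the source's output metadata match the package exactly.
SELECT CUSTOMERID AS CustomerId,
       ORDERTOTAL AS OrderTotal
FROM   dbo.Orders;
```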
