Showing posts with label child. Show all posts

Wednesday, March 28, 2012

PackageStart/End Events

Just finishing off a large ETL system and I have a question about the following:

We have 164 child packages being called from a single parent package which is set up to perform logging to a SQL Server table. Anything that errors or has warnings is logged accordingly; however, how do you trap the PackageStart and PackageEnd events when the child package knows nothing about logging?

My first thought was to have the parent call the AddEvent SP with the appropriate values, but I thought I would check here in case I missed something.

In a parent/child package structure I place all my logging in one place - in the parent package. All events (including child package OnPreExecute/OnPostExecute events) "bubble-up" to the parent package where they are "trapped" by the log provider.

-Jamie

|||That's exactly what I'm doing; I get everything but the start/end events.|||

Do you mean OnPreExecute & OnPostExecute?

I can't imagine why you're not getting them. It works for me!

-Jamie

|||The parent package adds entries to SYSDTSLOG90 for PackageStart and PackageEnd, plus any other log entries (Warnings, Errors etc.) from either the Parent or the Child, but excluding the PackageStart and PackageEnd entries for the Child package. It's those events I would like to get logged; since we have no warnings or errors, the log looks a bit thin with just the Parent PackageStart and then, 4-5 hours later, the Parent PackageEnd.|||

Then I'm stumped!

The parent logs its 'PackageStart', it then executes the 'Execute Package' Task a number of times, all packages execute normally, and any Errors or Warnings bubble up to the parent. The only things that appear not to bubble up (and consequently are not logged to sysdtslog90) are the Child PackageStart and PackageEnd events.

In the parent I have configured the 'Execute Package' Task so: DisableEventHandlers=False, and EventHandlers for OnError, OnPostExecute, OnPreExecute, OnTaskFailed and OnWarning; each of these has no Tasks but merely serves as a placeholder in order to set Propagate=False.

It's not causing any problem to our application other than the fact that I am trying to write a UI that links our application-maintained log with sysdtslog90.

Any pointers gratefully accepted

Paul
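One workaround, given that child PackageStart/PackageEnd events don't bubble up to the parent's log provider, is to have the parent write synthetic rows itself: an Execute SQL Task immediately before and after each 'Execute Package' Task. A sketch against the standard SQL Server log provider table follows; the package name and the two GUIDs are placeholders.

```sql
-- Hypothetical: insert a synthetic PackageStart row for a child package
-- just before its 'Execute Package' Task runs (mirror it with a
-- 'PackageEnd' row afterwards). The column list matches the sysdtslog90
-- table created by the SQL Server log provider.
INSERT INTO dbo.sysdtslog90
    (event, computer, operator, source, sourceid, executionid,
     starttime, endtime, datacode, databytes, message)
VALUES
    ('PackageStart', HOST_NAME(), SUSER_SNAME(),
     'ChildPackageName',                      -- child package name (placeholder)
     '00000000-0000-0000-0000-000000000000',  -- child package GUID (placeholder)
     '00000000-0000-0000-0000-000000000000',  -- execution GUID (placeholder)
     GETDATE(), GETDATE(), 0, 0x,
     'Beginning of package execution.');
```

Passing the parent's System::ExecutionInstanceGUID into the executionid column keeps the synthetic rows joinable to the rest of that run's log entries.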

Monday, March 26, 2012

Package xml config files used in child packages?

I have a package that executes 3 other packages in it. From the command line I pass in which config file to use (it just contains the db/server). Does this setting get carried over into the child packages as well?

If you are using the Execute Package task to execute child packages it looks like you bumped into the current SSIS limitation. Take a look at the following posting:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2078371&SiteID=1

Regards,

Yitzhak

|||

Seems like that problem has to do with setting variables at the command line in child packages?

I'm just curious: if I use a config file that specifies a connection to the database to use, what's the best way to pass that connection on to the child packages?

|||

You can pass information through parent package variable (PPV) configurations. However, PPVs are applied after all other configurations, so this can cause some problems. For example, if you are trying to pass the connection string for the database used by the SQL Server configurations, the PPV isn't applied until after the SQL Server configurations have already been processed.

Another way to accomplish this is to use an Execute Process task instead of the Execute Package task. Then you can call DTEXEC and pass along whatever connection strings you want to use.
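As a sketch of the Execute Process / DTEXEC route: dtexec's /CONNECTION switch overrides a named connection manager in the child package at invocation time. The file path, connection manager name, server and database below are all placeholders.

```
DTExec /FILE E:\ETL\Child1.dtsx /CONNECTION DataConnection;"Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
```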

Finally, you can use a Script Task to load the child package from the SSIS object model, set values in the package, and launch it from that.
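For the Script Task route, a minimal VB.NET sketch using the SSIS runtime object model might look like the following; the package path, connection manager name and connection string are placeholders.

```vb
' Inside an SSIS Script Task (requires a reference to the SSIS runtime,
' Microsoft.SqlServer.ManagedDTS).
Imports Microsoft.SqlServer.Dts.Runtime

Public Sub Main()
    Dim app As New Application()
    ' Load the child package from the file system (path is a placeholder).
    Dim child As Package = app.LoadPackage("E:\ETL\Child1.dtsx", Nothing)
    ' Override the connection string before executing.
    child.Connections("DataConnection").ConnectionString = _
        "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
    If child.Execute() = DTSExecResult.Success Then
        Dts.TaskResult = Dts.Results.Success
    Else
        Dts.TaskResult = Dts.Results.Failure
    End If
End Sub
```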

Package variables

Couldn't quite find the answer I was looking for via a forum search.

I have 9 packages that are currently called by a master package - all child packages take a datetime variable from the master package which is subsequently used in the child processes.

The question I have is that if I run the master package manually from Visual Studio I can set the master package variable manually to a date of my choosing and run it, which works fine. However we will be wishing to automate the package execution at some point and want to know the best way to run the package(s) on an automated basis and have the data variable supplied to the master package.

What would be the best way to do this ?

Presently we have a variable called MasterPackage_vLoaddate which is a DateTime data type.

Any help appreciated.

You can use the /SET option of dtexec.exe to set your variable at run time.
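For example, a scheduled command line might look like the following (the package path is illustrative; the variable name matches the one in the question):

```
DTExec /FILE E:\ETL\Master.dtsx /SET \Package.Variables[MasterPackage_vLoaddate].Value;"2007-02-10 00:00:00"
```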

Or, you can use expressions on that variable if you want to use something like getdate().|||

Thanks Phil..

OK... you'll need to take me through this in a little more basic detail.

Let's say I wanted to run this job at 07:00 every morning. The variable (date) should always be for the prior day.

So I am guessing I'd need to use something like

dateadd(day,-1,getdate())

Can I just add that to the package in VS? Or where exactly would I use this in order to set the variable correctly? Sorry for being such a noob!!

|||Yes, you have the correct formula.

So, in the properties of the variable you wish to work with, set "EvaluateAsExpression" equal to true. Then, in the "expression" parameter, add your formula. It's that simple.

Every time the package runs, this formula will be evaluated.|||

Will,

Keep in mind that if you want the ability to run the package for any date you choose, you would need to put extra logic on this (e.g. if you need to reprocess a day that is not the current day, or if for some reason the package did not run on a specific day).

You could easily implement that logic with a table that keeps track of the execution dates.

Just my two cents

|||

OK...

I'll test that in the morning. I used

set "EvaluateAsExpression" equal to true

and then used the formula dateadd("d",-1,getdate())

I am supposing that it will just be a straightforward job of scheduling the master package run in SQL Agent and then the other packages will be called as per expectations ?

Cheers... your reply helped

|||

Will Riley wrote:

I am supposing that it will just be a straightforward job of scheduling the master package run in SQL Agent and then the other packages will be called as per expectations ?

Yep, that should be pretty straightforward.|||

Hi Rafael,

I do already have a table that logs package execution so that, in theory, the same day cannot be processed twice. Would you suggest that I derive the date variable from that table (i.e. max date + 1, i.e. something like a SQL command... SELECT dateadd("d",1,MAX(load_dates.date)) )?

If so, how would I pass that to the package? It might be something I hold in reserve in case Phil's approach above proves unworkable.

Cheers

|||

Will Riley wrote:

Hi Rafael,

I do already have a table that logs package execution so that in theory, the same day cannot be processed twice. Would you suggest that I derive the date variable from that table (i.e. max date + 1 i.e. something like a SQL command.... SELECT dateadd("d",1,MAX(load_dates.date)) )

If so , how would I pass that to the package ? it might be something I hold in reserve in case the above proves unworkable from Phil.

Cheers

My solution will work, it just won't be as flexible as Rafael's.

For the table derived option, use an Execute SQL task in the control flow to select the max(date)+1 and then store the result into your package's date variable.|||

Just one more thing....

Am I likely to run in to an issue with times?

The above solution will equate to a datetime value of something like

11/02/2007 07:00

I would rather have

11/02/2007 00:00

As there may well be issues if the time is anything other than 00:00

Is there a simple bit of SSIS "expression" that will evaluate to "yesterday" at 00:00 ?

|||

Will Riley wrote:

Just one more thing....

Am I likely to run in to an issue with times?

The above solution will equate to a datetime value of something like

11/02/2007 07:00

I would rather have

11/02/2007 00:00

As there may well be issues if the time is anything other than 00:00

Is there a simple bit of SSIS "expression" that will evaluate to "yesterday" at 00:00 ?

Put this in front of your formula: (DT_DBTIMESTAMP)(DT_DBDATE)

So you would have:
(DT_DBTIMESTAMP)(DT_DBDATE)dateadd("d",-1,getdate())


If you don't even want the 00:00 on the end, don't include the (DT_DBTIMESTAMP) cast.|||

Phil Brammer wrote:


My solution will work, it just won't be as flexible as Rafael's.

I never meant it won't work. I just wanted to add a point I thought was important to get that issue solved.

|||

Will Riley wrote:

Hi Rafael,

I do already have a table that logs package execution so that in theory, the same day cannot be processed twice. Would you suggest that I derive the date variable from that table (i.e. max date + 1 i.e. something like a SQL command.... SELECT dateadd("d",1,MAX(load_dates.date)) )

If so , how would I pass that to the package ? it might be something I hold in reserve in case the above proves unworkable from Phil.

I think the approach being discussed would work fine; you just need to base the logic on that table you already have (good that you already have it!).

Having logic that depends on 'today's' date is asking for trouble in the long term, when things do not happen/run as expected.

|||

Rafael Salas wrote:

I never meant it won't work. I just wanted to add a point I thought was important to get that issue solved.

I agree. Will mentioned it.|||

Phil, Rafael,

Thanks both - between you you provided both the answer(s) I needed and some good advice in terms of the use/dangers of dynamic variables.

Cheers

Friday, March 23, 2012

Package Time Outs

Okay, we are running our Master Package (and therefore all related Child packages) through a .bat file. The .bat file is scripted using the following logic for an entire month of daily runs:

Code Snippet

DTExec /FILE E:\ETL\FinancialDataMart\Master.dtsx /DECRYPT masterpwd /SET \Package.Variables[ReportingDate].Value;"2/01/2007" > E:\ETL\ErrorLogs\Processing\etl_20070201log.txt
IF NOT %ERRORLEVEL%==0 GOTO ERROR%ERRORLEVEL%
mkdir E:\ETL\ErrorLogs\Archive\20070201
move E:\ETL\ErrorLogs\Processing\*.txt E:\ETL\ErrorLogs\Archive\20070201


DTExec /FILE E:\ETL\FinancialDataMart\Master.dtsx /DECRYPT masterpwd /SU /SET \Package.Variables[ReportingDate].Value;"2/02/2007" > E:\ETL\ErrorLogs\Processing\etl_20070202log.txt
IF NOT %ERRORLEVEL%==0 GOTO ERROR%ERRORLEVEL%
mkdir E:\ETL\ErrorLogs\Archive\20070202
move E:\ETL\ErrorLogs\Processing\*.txt E:\ETL\ErrorLogs\Archive\20070202

etc...

Generally it takes about 40-45 minutes to run one day's worth of data. However, we have found unpredictable instances where the job will take 3 hours or even 6 hours and appear to hang...

The weirdness sets in when we kill the job and rerun it. In all instances of a rerun, the job will execute in the normal 40-45 minute time frame. Obviously, we would like to institute some sort of logging, monitoring and error handling... including, if need be, a method to time out a process and restart it.

I am reviewing the WMI (Windows Management Instrumentation) Task but I'm not entirely convinced that it's the right tool for the job.

Questions:

    Has anyone else experienced the type of processing behavior that I described?
    Has anyone been successful at using WMI or another process to monitor and time out packages? If so, are there sample packages or a good tutorial that maps it out?
    Unrelated to this issue, we also have instances of incomplete processing logs. The logs don't finish writing, and the weird part is that they all end at the same point. Does anyone have experience with incomplete job logs?:

    Code Snippet

    Progress: 2007-06-20 12:46:49.87
    Source: Update factFinancial Data Flow
    Cleanup: 11% complete

Thanks in advance!

Sounds like you're encountering some sort of deadlock. It's not clear to me whether or not you've enabled logging in all the packages; have you?

Have you installed SP2? We added some more logging in SP2 around calls to external databases.

You shouldn't need to kill the packages, but if you really, really want to go that way, you could use Execute Process tasks (calling DTExec) instead of the Execute Package tasks, and set the TimeOut and TerminateProcessAfterTimeout properties.

Tuesday, March 20, 2012

Package error: Cannot create thread

I have a child package that has been run successfully multiple times in the last month +. Each time with roughly the same amount of data, give or take a few thousand rows.

Suddenly, this child package is now giving me the following errors from the log file:

Error: 2006-11-17 12:04:19.98
Code: 0xC0047031
Source: DFLT Primary DTS.Pipeline

Description: The Data Flow task failed to create a required thread and cannot begin running. This usually occurs when there is an out-of-memory state.

End Error

Error: 2006-11-17 12:04:20.03
Code: 0xC004700E
Source: DFLT Primary DTS.Pipeline

Description: The Data Flow task engine failed at startup because it cannot create one or more required threads.

End Error

I tried taking the child out of the parent and running it by itself. I still get the same error. There are three other child packages that run on the exact same data and they have no problems. The control flow for the package first runs a SQL command. Then it has a data flow. The data flow grabs records from the source, adds two derived columns, looks up data and then stores to the destination. Relatively easy compared to other packages that are running just fine.

I've had our network people check both the server running SSIS and the database server (two different machines) and there are no memory spikes while the package is running.

Any ideas?

The problem could also be related to all the other applications/processes running on that box. Can you try stopping other SSIS packages and applications, and try running the problematic one? If that works, it means the machine is maxed out on the available number of threads, or on memory.

It's also possible that other applications might be running zombie processes on the box, even if the process seems to exit, there could be a thread or memory leak.

I'd use the perfmon tool to read some of the critical resources for the box on which SSIS is running. You don't need to look at the database server, as SSIS creates a new process only on the box it's running on. I'd check: threads/process, total threads, memory/process, total memory, page faults.

Package crosstalk?

I'm trying to convince myself that I am not crazy...

I'm having some odd things occur with a couple of child packages being called from a parent package. Inside the master package, I have two different Execute Package tasks that are contained in two different ForEach Loop containers. Both of these containers are downstream of another container. The only Precedence Constraints on either is "Successful".

One of these child packages loads a Household dimension table. The other loads a Customer-to-Account bridge table . I built each child package with a template, so there are many of the same variables, containers, etc.

The first time I ran the master package I noticed my Household package failed because its Flat File Connection Manager was trying to pick up the file my Bridge package needed. I then disabled the container in the master package with the Bridge task so that only the Household package would run. The next time I ran it, I put some PreExecute breakpoints in the Household package. When the breakpoint was hit, the Locals tab showed a mix of Household variable values and Bridge variable values.

So I went to open my Bridge package and the breakpoints I had set in the Household package were also set on the Bridge package. Yet, I had the Bridge package closed when I set the Household breakpoints. If I unset a breakpoint in one, it automatically unset it in the other.

I had seen this behavior before, but it cleared up and I guess I assumed I had dreamt it all up. Anyone else have any experience with this?This may help-->http://sqljunkies.com/WebLog/knight_reign/comments/5366.aspx|||I appreciate the link, R.K.S. Rather than helping me out, though, it's validating that what I'm seeing is bug-like.

I mentioned that the two child packages have like-named variables and that these two packages most likely would be executed at the same time as a result of their parallel position and precedence criteria following an upstream container.

The link states that the variables defined in the child packages themselves are hidden from each other. But with the parallel positioning of their respective Execute Package tasks in my master package, that's not what I'm seeing.

As a test, I moved the position of the Bridge's Execute Package tasks further downstream in my master package. The Household package picks up the correct values in its variables and finishes with no problem.

Curiously, the setting and unsetting of breakpoints across the two packages still occurs.|||Sorry, haven't had much experience with the situation you're describing. I do have a similar problem where I have two child packages that run fine independently but when I run them from a parent the variables in each respective child seem to "step on each other", for lack of a better term. The children were created from the same original package and contain the same variable names. All variables are scoped at the package(child) level.|||

mr_superlove wrote:


Curiously, the setting and unsetting of breakpoints across the two packages still occurs.

This may occur if one of the packages was a copy of another (e.g. you copied one package file and added it as second package). This can confuse the designer, as both packages have the same IDs and the tasks inside them have the same IDs, so the designer can't distinguish between them.

To fix this condition, open one of the packages (or both), make sure the package object is selected in Properties panel, find a property called "ID", click the drop down at the right and select 'Generate new ID'. Save the package. This should fix the problem.

|||Thanks, Michael. This partially works. Certainly at the package level, generating a new ID prevents the package-level breakpoints from being confused. I can set the OnPreExecute breakpoint on my Household package without it propagating to the Bridge package.

I'm still seeing confusion on breakpoints set at the container level, however. The IDs for these objects are grayed out. Any way to generate those?

I assume the like IDs are also the reason for the variable value cross-pollination I was seeing on my copied packages executing at the same time. Any way to just regenerate the whole lot of IDs? Thanks.|||

mr_superlove wrote:

I assume the like IDs are also the reason for the variable value cross-pollination I was seeing on my copied packages executing at the same time. Any way to just regenerate the whole lot of IDs? Thanks.

Here is a KB article on how to re-generate package IDs:

http://support.microsoft.com/kb/906564/

Rafael Salas
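For reference, the KB's approach boils down to a dtutil call like the one below (the file name is a placeholder). Note that /I regenerates only the package-level GUID, not the GUIDs of the tasks inside the package, which matches the behavior discussed in this thread.

```
dtutil /I /FILE E:\ETL\Bridge.dtsx
```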

|||Thanks, Rafael, but I believe this link merely describes an additional, command-line method of doing what Michael posted.

It doesn't address task IDs within the packages themselves. For instance, here are the IDs for the data flow task in each package:

{BD576B04-E131-479C-B720-75848C51F9E6}
{BD576B04-E131-479C-B720-75848C51F9E6}

These are grayed out and, at least within BIDS, static. I suppose I could modify the IDs manually in a text editor, but I shudder to think of the problems that could cause.

It's a great learning experience, though. It certainly speaks to problems I've had in the past with copying data flow components. I'm not crazy after all.|||

I've had a problem with the fact that every "named" object is eventually reduced to a GUID, and only referenced by the GUID during execution (at least, that's how it appears). I believe you're bang-on that the problem you're seeing is caused by the same GUID being used to ID two different objects.

This is precisely why I've pretty much stopped copying and pasting objects -- BIDS does not do a good job of assigning new GUIDs to objects pasted into a project from the clipboard. I've had way too many problems because of this issue. (I really wish Microsoft had used a more Visual Studio-like paradigm, where each object was a discrete file like in a VB.NET or C# project. This reliance on XML is just too inefficient.)

I also believe you're going to have to manually edit the DTSX file using a text editor to fix the problem. There are several issues you're going to encounter, the first being that when you search for a GUID in the DTSX, how are you going to tell which object it references? If you manually update the GUID (like, changing the "E6" at the end to "E7", making sure that {BD576B04-E131-479C-B720-75848C51F9E7} doesn't already exist in the DTSX file), you have to be sure to update only the GUID that refers to Data Flow Task 2, and not Data Flow Task 1.

I've actually had good luck directly editing my DTSX when needed. In many cases it's not too hard to figure out which block of XML refers to which object, based on the context in which it appears (other values around it, etc.). So far I've been able to patch in variables, variable assignments, configuration information, and other stuff. But nothing quite as ambitious as you need.

My guess is that, other than completely rebuilding one of your data flow tasks, the only fix is going to be to directly edit the DTSX file (after backing it up, of course!).

Good luck!

Package configuration, Parent-child packages and Job scheduling

Hi,

I've found this problem: when I change settings in my configuration file, the changes do not automatically apply to all child packages which use the same configuration file if run from a job in SQL Server Agent. I need to open the package and save it again from BIDS. I use one "load group" package to execute all the other packages.

Is there a way from the job configuration to set a setting so the package always has the newest configuration?

Regards
Simon
Hi,

Seems like no one has an answer for this.

Isn't it fundamentally important that changes in configuration files automatically apply to packages that use them? Am I the only one with this problem?

Regards
Simon
|||I haven't observed it myself. Does this only occur when you are running from SQL Agent? Does it work properly when you run from the command line?|||

When the package is loaded it will look at the config settings. If you aren't passing entries from the parent to the child then it will get them from the config file (where are you holding the config settings - sounds like XML?).

You seem to be suggesting the parent package gets the correct values but the child packages get old values. I've not seen a package ignore a config file - are you sure the packages are reading the correct files? Are the files on the same server (where the packages are being run)?

Opening the package and saving it again shouldn't affect the config entries unless you look at them - in which case they will be overwritten with the values in the package.

|||

Hi,

I have the same problem with the configuration file being ignored when executing an SSIS package via a SQL Agent job. Have you found the solution to the problem yet?

regards,

mnguyen

|||

mnguyen wrote:

Hi,

I have the same problem with configuration file being ignored when executed ssis package via sql agent job. Have you found the solution to the problem yet?

regards,

mnguyen

If the package is running with the default values, maybe it is not able to find the configuration file. Are you sure it is in an appropriate location? Have you tried explicitly setting the configuration file via the /ConfigFile switch?
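For example, forcing the configuration explicitly with an absolute path (both file locations below are placeholders); an absolute path to the .dtsConfig file also avoids the relative-path pitfall:

```
DTExec /FILE E:\ETL\LoadGroup.dtsx /CONFIGFILE "E:\ETL\Config\Production.dtsConfig"
```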

|||Thanks both to NigelRivett and jwelch for your suggestions, and I'm sorry that I haven't had the time to reply.

When I return from holiday I will investigate further...
|||

mnguyen wrote:

Hi,

I have the same problem with configuration file being ignored when executed ssis package via sql agent job. Have you found the solution to the problem yet?

regards,

mnguyen

I spoke with mnguyen offline and we determined that it was an issue with using a relative path to the configuration file.

Monday, March 12, 2012

Package Configuration

Here is the scenario we are trying to use (it's a bit long-winded, please bear with me):

There are 2 packages, Parent and Child. Both have 2 Connection Managers ("Configuration" & "Data Connection"). The "Data Connection" has an invalid connection set to start with; this is so that we can deploy this to multiple places and use a SQL table to hold the configuration.

At runtime "Configuration" is set to a valid SQL 2005 database connection via the command line or in BI, but "Data Connection" is left invalid.

The Package Configuration points to a SQL server using "Configuration"; this has the connection string for the "Data Connection" Connection Manager (a valid connection string). The package errors with "cannot acquire connection".

The first task is to put the connection string of the Connection Manager "Configuration" into a variable. Then the child package is called.

The Child package has the same 2 connections, "Configuration" & "Data Connection"; both are invalid at this time.

1) We use a Package Configuration to populate a variable from the parent package which holds the "Configuration" connection string.

2) We use a Package Configuration to populate the "Configuration" connection from the parent package which holds the "Configuration" connection.

3) We use a Package Configuration to populate the "Data Connection" Connection Manager using the "Configuration" connection, which should be set via step 2 here.

The first task on the child package has a bit of script to check that all the connection managers return a valid connection, to make sure the databases are up and running before connecting. This fails at the moment, returning an error.

We want to use the same code in multiple environments and as such don't want to make code changes that are really only environment changes, i.e. connection strings to servers. I can't seem to see anything saying we are doing anything wrong, but if the connections are all set to a valid connection before running, this seems to work. When invalid connections are used, this seems to fall over.

Peter, you might be seeing the result of a known issue. When you define a package configuration using the 'Parent package variable' type, no matter in which order you place it, it will be executed last.

You can see the details here: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=126146

PeterCh wrote:


The Child package has the same 2 connections, "Configuration" & "Data Connection"; both are invalid at this time.


1) We use a Package Configuration to populate a variable from the parent package which holds the "Configuration" connection string.

2) We use a Package Configuration to populate the "Configuration" connection from the parent package variable which holds the "Configuration" connection string.

3) We use a Package Configuration to populate the "Data Connection" Connection Manager using the "Configuration" connection, which should be set via step 2 here.

With this bug, your 2nd step occurs last, so your 3rd step tries to use the invalid connection string defined in the package; as a consequence, the configuration is not actually performed.

Just to check, when running the packages in debug mode in BIDS, go to the progress tab and see in which order the package configurations are performed.

As a workaround, and since this issue only affects the behavior of the 'parent package variable' configuration type: instead of sending the connection string of "Configuration" via the command line when calling the parent package, you can try placing it in an environment variable or file and then creating a configuration on every package. Then create additional package configurations for the remaining connection managers using the "Configuration" connection manager.

I hope this helps.
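The environment-variable workaround can be sketched like this (the variable name and connection string are made up for illustration); the point is that every package resolves the same variable independently at load time, so nothing depends on the parent-package-variable ordering:

```python
import os

# Hypothetical variable name and connection string -- set once per environment.
os.environ["ETL_CONFIG_CONN"] = (
    "Data Source=MYSERVER;Initial Catalog=SSISConfig;Integrated Security=SSPI;"
)

def load_configuration_connection() -> str:
    # Each package (parent or child) would read this directly and in a
    # deterministic order, instead of waiting on a parent package variable.
    return os.environ["ETL_CONFIG_CONN"]

print(load_configuration_connection())
```

Only the server/catalog values change between environments; the packages themselves stay untouched.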

Friday, March 9, 2012

p/c selfjoin question

Hi,
I need some help on a specific self join question.
I have a parent/child table:
CREATE TABLE [TAB_PC_KST] (
[Element] [varchar] (255) NULL ,
[Parent] [varchar] (255) NULL ,
[EType] [char] (1) NULL ,
[Weight] [float] NULL
) ON [PRIMARY]
GO
where EType is the element type and can be 'C' (for consolidated element)
or 'N' (non consolidated element).
Lets say in that table is the following content:
Element Parent EType Weight
All Members <NULL> C <Null>
First Child All Members C 1
Second Child All Members N 1
First Sub First Child N 1
Second Sub First Child N 1
That will give a structure like the following
All Members
|--First Child
|  |--First Sub
|  |--Second Sub
|--Second Child
What I need is a select on that table, with the following result set
EType Element Weight
C All Members <NULL>
<NULL> First Child 1
<NULL> Second Child 1
C First Child <NULL>
<NULL> First Sub 1
<NULL> Second Sub 1
Is that possible without a cursor on all C-Elements? Perhaps with a self-join or subqueries?
Thanks for any help!
Cheers
Thomas|||
Look at this great script (example) written by Itzik Ben-Gan:
IF object_id('dbo.Employees') IS NOT NULL
DROP TABLE Employees
GO
IF object_id('dbo.ufn_GetSubtree') IS NOT NULL
DROP FUNCTION dbo.ufn_GetSubtree
GO
CREATE TABLE Employees
(
empid int NOT NULL,
mgrid int NULL,
empname varchar(25) NOT NULL,
salary money NOT NULL,
CONSTRAINT PK_Employees_empid PRIMARY KEY(empid),
CONSTRAINT FK_Employees_mgrid_empid
FOREIGN KEY(mgrid)
REFERENCES Employees(empid)
)
CREATE INDEX idx_nci_mgrid ON Employees(mgrid)
INSERT INTO Employees VALUES(1 , NULL, 'Nancy' , $10000.00)
INSERT INTO Employees VALUES(2 , 1 , 'Andrew' , $5000.00)
INSERT INTO Employees VALUES(3 , 1 , 'Janet' , $5000.00)
INSERT INTO Employees VALUES(4 , 1 , 'Margaret', $5000.00)
INSERT INTO Employees VALUES(5 , 2 , 'Steven' , $2500.00)
INSERT INTO Employees VALUES(6 , 2 , 'Michael' , $2500.00)
INSERT INTO Employees VALUES(7 , 3 , 'Robert' , $2500.00)
INSERT INTO Employees VALUES(8 , 3 , 'Laura' , $2500.00)
INSERT INTO Employees VALUES(9 , 3 , 'Ann' , $2500.00)
INSERT INTO Employees VALUES(10, 4 , 'Ina' , $2500.00)
INSERT INTO Employees VALUES(11, 7 , 'David' , $2000.00)
INSERT INTO Employees VALUES(12, 7 , 'Ron' , $2000.00)
INSERT INTO Employees VALUES(13, 7 , 'Dan' , $2000.00)
INSERT INTO Employees VALUES(14, 11 , 'James' , $1500.00)
GO
CREATE FUNCTION dbo.ufn_GetSubtree
(
@mgrid AS int
)
RETURNS @tree table
(
empid int NOT NULL,
mgrid int NULL,
empname varchar(25) NOT NULL,
salary money NOT NULL,
lvl int NOT NULL,
path varchar(900) NOT NULL
)
AS
BEGIN
DECLARE @lvl AS int, @path AS varchar(900)
SELECT @lvl = 0, @path = '.'
INSERT INTO @tree
SELECT empid, mgrid, empname, salary,
@lvl, '.' + CAST(empid AS varchar(10)) + '.'
FROM Employees
WHERE empid = @mgrid
WHILE @@ROWCOUNT > 0
BEGIN
SET @lvl = @lvl + 1
INSERT INTO @tree
SELECT E.empid, E.mgrid, E.empname, E.salary,
@lvl, T.path + CAST(E.empid AS varchar(10)) + '.'
FROM Employees AS E JOIN @tree AS T
ON E.mgrid = T.empid AND T.lvl = @lvl - 1
END
RETURN
END
GO
SELECT empid, mgrid, empname, salary
FROM ufn_GetSubtree(2)
GO
/*
empid mgrid empname salary
2 1 Andrew 5000.0000
5 2 Steven 2500.0000
6 2 Michael 2500.0000
*/
/*
SELECT REPLICATE (' | ', lvl) + empname AS employee
FROM ufn_GetSubtree(1)
ORDER BY path
*/
/*
employee
--
Nancy
| Andrew
| | Steven
| | Michael
| Janet
| | Robert
| | | David
| | | | James
| | | Ron
| | | Dan
| | Laura
| | Ann
| Margaret
| | Ina
*/
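For readers who want the shape of ufn_GetSubtree without running the T-SQL, here is a sketch in Python of the same level-by-level expansion (seed the result with the requested root, then repeatedly join the employee list against the previous level until nothing new appears):

```python
# empid -> (mgrid, name), mirroring the INSERTs in the script above.
employees = {
    1: (None, 'Nancy'), 2: (1, 'Andrew'), 3: (1, 'Janet'), 4: (1, 'Margaret'),
    5: (2, 'Steven'), 6: (2, 'Michael'), 7: (3, 'Robert'), 8: (3, 'Laura'),
    9: (3, 'Ann'), 10: (4, 'Ina'), 11: (7, 'David'), 12: (7, 'Ron'),
    13: (7, 'Dan'), 14: (11, 'James'),
}

def get_subtree(root):
    # (empid, lvl, path) rows, like the @tree table variable.
    tree = [(root, 0, f'.{root}.')]
    lvl = 0
    while True:
        # Join employees against the rows added at the previous level,
        # exactly like the WHILE @@ROWCOUNT > 0 loop in the function.
        nxt = [(e, lvl + 1, path + f'{e}.')
               for e, (m, _) in employees.items()
               for p, l, path in tree
               if m == p and l == lvl]
        if not nxt:
            return tree
        tree += nxt
        lvl += 1

print([e for e, _, _ in get_subtree(3)])  # → [3, 7, 8, 9, 11, 12, 13, 14]
```

The materialized path ('.3.7.11.' and so on) is what lets the commented-out REPLICATE query print the tree in display order.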
"Thomas Seidel" <tseidel@.misag.com> wrote in message
news:%234qjvclSFHA.688@.TK2MSFTNGP10.phx.gbl...
> Hi,
> I need some help on a specific self join question.
> I have a parent/child table:
> CREATE TABLE [TAB_PC_KST] (
> [Element] [varchar] (255) NULL ,
> [Parent] [varchar] (255) NULL ,
> [EType] [char] (1) NULL ,
> [Weight] [float] NULL
> ) ON [PRIMARY]
> GO
> where EType is the element type and can be 'C' (for consolidated element)
> or 'N' (non consolidated element).
> Lets say in that table is the following content:
> Element Parent EType Weight
> All Members <NULL> C <Null>
> First Child All Members C 1
> Second Child All Members N 1
> First Sub First Child N 1
> Second Sub First Child N 1
> That will give a structure like the following
> All Members
> |--First Child
> |  |--First Sub
> |  |--Second Sub
> |--Second Child
> What I need is a select on that table, with the following result set
> EType Element Weight
> C All Members <NULL>
> <NULL> First Child 1
> <NULL> Second Child 1
> C First Child <NULL>
> <NULL> First Sub 1
> <NULL> Second Sub 1
> Is that possible without a cursor on all C-Elements? Perhaps with a self-join
> or subqueries?
> Thanks for any help!
>
> Cheers
> Thomas
>|||On Tue, 26 Apr 2005 14:04:23 +0200, Thomas Seidel wrote:
(snip)
>What I need is a select on that table, with the following result set
>EType Element Weight
>C All Members <NULL>
><NULL> First Child 1
><NULL> Second Child 1
>C First Child <NULL>
><NULL> First Sub 1
><NULL> Second Sub 1
>Is that possible without an cursor on all C-Elements? Perhaps with selfjoin
>or
>subqueries?
Hi Thomas,
The query below will not return the exact same set as above; the difference is basically formatting that's better done at the client side. (It can be done in SQL, but it ain't pretty and it won't run quick.)
SELECT a.Element, b.Element, b.Weight
FROM TAB_PC_KST AS a
INNER JOIN TAB_PC_KST AS b
ON b.Parent = a.Element
WHERE a.EType = 'C'
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)
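Hugo's self-join can be mirrored in Python to see exactly which rows it produces (sample data copied from the question):

```python
# (Element, Parent, EType, Weight) rows from the question.
rows = [
    ('All Members', None, 'C', None),
    ('First Child', 'All Members', 'C', 1.0),
    ('Second Child', 'All Members', 'N', 1.0),
    ('First Sub', 'First Child', 'N', 1.0),
    ('Second Sub', 'First Child', 'N', 1.0),
]

# Self-join: every consolidated ('C') element paired with its direct children,
# matching ON b.Parent = a.Element WHERE a.EType = 'C'.
result = [(a[0], b[0], b[3])
          for a in rows if a[2] == 'C'
          for b in rows if b[1] == a[0]]

for parent, child, weight in result:
    print(parent, child, weight)
```

The parent header rows of the requested output (the "C All Members <NULL>" lines) are exactly the grouping keys here, which is the client-side formatting Hugo refers to.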