
Wednesday, March 28, 2012

Padding and Writing to a fixed format flat file!

Hi,

I am trying to write to a fixed-format flat file using the Flat File Destination data flow component. I have gathered all the required information from more than one source, but when I tried to format the columns into one big string that makes up a line in the flat file, I could not figure out how to do it. A couple of the issues I am facing:

    1. How do I pad different columns? For example, one integer column could be 1 to 10 characters long in my case. When I convert it to a string, I don't know how to pad the remaining characters, i.e. if the integer value is '1234', it should be written to the file as '1234      '. Which transformation is best in this case, if one is available?
    2. How do I convert a T-SQL datetime to a specific date and time format to write to the flat file? I have to write these date formats depending on one of the parameters passed.
    3. I don't want a delimiter at the end of each column, just the newline characters at the end of each record.
    4. Some of the columns contain unwanted characters (like newline characters). How do I find and remove them from the string?
    5. Can we write columns directly to a specific position in the flat file? E.g. col 1 at position 1 and col 2 starting at position 20, etc.

Your co-operation will be appreciated.

Thanks,

Paraclete

Take a look at a ragged-right or fixed-width flat file connection manager. When you create the connection manager, you can choose Ragged right or Fixed width instead of Delimited. This should solve a number of your issues.
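For illustration, the fixed-width layout described above (columns at fixed positions, no delimiters, newline-terminated records) can be sketched outside SSIS. This Python sketch uses hypothetical column widths and also strips embedded newlines, per the question; the date format is an assumption:

```python
from datetime import datetime

def make_fixed_width_line(row, widths):
    """Pad each column to its fixed width, strip embedded newlines,
    and terminate the record with a newline only (no column delimiters)."""
    parts = []
    for value, width in zip(row, widths):
        text = str(value).replace("\r", " ").replace("\n", " ")
        parts.append(text.ljust(width)[:width])  # pad with spaces, truncate overflow
    return "".join(parts) + "\n"

# Hypothetical layout: col 1 occupies positions 1-19, col 2 starts at position 20.
widths = [19, 16]
row = [1234, datetime(2012, 3, 28).strftime("%Y%m%d")]  # date format is an assumption
line = make_fixed_width_line(row, widths)
```

Inside SSIS, the Fixed width connection manager does this padding for you; the sketch is only the rule it applies.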

pad the generated flat file with 0

Hi Guys,

Related to my last questions on the SSIS work I'm doing: is there a way to pad my generated flat file with zeros dynamically?

I'm getting the data from one table and then generating the file. The file I need to generate must have the data at specific positions, as the file is used by another system.

Depending on the data, I want to apply padding of "0" and "3" in certain fields. How am I supposed to do it?

Apart from this, I would need to merge 2 or more columns, and before the merge do an arithmetic operation.

What would be the best component to use script component or derived column?

Ta

Gemma

Gemma,

Not sure if this will help or not, but it is how I've padded zeros before...

Use a derived column:

(@.LENGTH is the length you would like to pad to)

(@.MyStr is the character expression or string you would like to pad)

SUBSTRING(REPLICATE("0", @.LENGTH), 1, @.LENGTH - LEN(TRIM(@.MyStr))) + TRIM(@.MyStr)

|||

Here's another - using the same variables as above:

RIGHT(REPLICATE("0",@.Length) + @.MyStr, @.Length)
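Neither SSIS expression can be run standalone, but the logic of the RIGHT-based version above can be sketched in Python (a rough equivalent for illustration, not the SSIS engine itself):

```python
def pad_left_zeros(value, width):
    """Rough Python equivalent of RIGHT(REPLICATE("0", width) + value, width):
    prefix with zeros, then keep only the rightmost `width` characters."""
    s = str(value).strip()
    return ("0" * width + s)[-width:]

print(pad_left_zeros("12345 ", 9))  # -> 000012345
```

One caveat of this form: a value longer than `width` is silently truncated from the left, exactly as RIGHT() would do, so it pays to validate lengths upstream.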

|||

Hi Guys,

I don't know whether this completely answers what I asked.

I need to pad the values dynamically. By this I mean the padding depends on the actual length of the data in each column: when the file is generated, I have to check each column, and if a column's data length is less than the fixed length, pad it with zeros.

Other than that, how am I supposed to use these variables?

That is why I asked Script or derived column which one.

Just a quick note: the file I'm generating is going to be fixed width, I guess, and I'm generating a dummy file, but with fixed width it's not putting each record on its own line. Instead the file has all the records on the same line. Why? I've already checked: there are 2 columns I'm writing, the first with a width of 20 and the second 35, and I still get this problem.

Ta

Gemma

|||

The above expressions do exactly what you asked. The variables are simply placeholders in the code. If you know that your column is 20 wide and you want to pad the unfilled data out to 20 places, you would simply put your column name in place of @.MyStr and 20 in place of @.Length.

For example, we have a membership number that needs to be padded out to 9 digits. The column's name is Member.

SUBSTRING(REPLICATE("0", 9), 1, 9 - LEN(TRIM([Member]))) + TRIM([Member])

So what that does is take a string of 9 zeros (REPLICATE("0", 9)), substring out positions 1 to (9 minus the trimmed column length), and then append the trimmed column value.

i.e. a member number of "12345 " would yield SUBSTRING(REPLICATE("0", 9), 1, 9 - 5) + "12345" > "0000" + "12345" > "000012345"

You would need to set this up for each column separately.

hope this helps.


Pad Spaces in FFConnection properties?

This is a considerably LOW priority, but is there a way to pad spaces in SSIS flat file connection properties? We have cases where a CSV file will be read in and throw a truncation error because the field length is greater than defined. However, the CSV file might look like ,"ABC    ", where a number of trailing spaces are found. Is there some way to handle this without having to adjust the field length, maybe ignore whitespace at the source?

|||

There is no way at the Flat File Source to ignore trailing whitespace; however, there is a workaround. Say in this case your field length is 3, and if there are trailing spaces after that, you want to ignore them. You can then use "Ignore Failure" for the Truncation Error Disposition. That is, in the Error Output tab for the Flat File Source, for Truncation of this column, choose "Ignore Failure" instead of "Fail Component". By doing this, it will only take the specified characters and ignore the rest, whether they are trailing white spaces or some other data. You have to make sure that this is not causing you to accept some data which you otherwise really wanted to fail on.

|||

That is partially the problem: we only want to ignore whitespace, but if there are actual characters then we would want it to throw the truncation error.

Thanks for your thoughts...
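The rule being asked for — accept trailing whitespace beyond the defined width, but still fail on real characters — isn't available at the Flat File Source, though it could be applied in a Script Component after reading the column wider than needed. A hedged Python sketch of the rule itself (the field width of 3 matches the example above):

```python
def check_field(raw, width):
    """Return the truncated field if everything past `width` is whitespace;
    raise to mimic a truncation error when real characters overflow."""
    overflow = raw[width:]
    if overflow.strip():
        raise ValueError(f"truncation: non-whitespace overflow {overflow!r}")
    return raw[:width]

print(check_field("ABC   ", 3))  # trailing spaces ignored -> ABC
```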

Monday, March 26, 2012

Package xml config files used in child packages?

I have a package that executes 3 other packages in it. From the command line I pass in which config file to use (it just contains the db/server). Does this setting get carried over into the child packages as well?

If you are using the Execute Package task to execute child packages it looks like you bumped into the current SSIS limitation. Take a look at the following posting:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2078371&SiteID=1

Regards,

Yitzhak

|||

It seems like that problem has to do with setting variables in child packages from the command line?

I'm just curious: if I use a config file that specifies a connection to the database to use, what's the best way to pass that connection on to the child packages?

|||

You can pass information through the parent package variable (PPV) configurations. However, PPVs are applied after all other configurations, so this can cause some problems. For example, if you are trying to pass the connect string to a database to use the SQL Server configurations, the PPV isn't applied until after the SQL Server configurations are already done.

Another way to accomplish this is to use an Execute Process task instead of the Execute Package task. Then you can call DTEXEC, and pass along whatever connection strings you want to use.

Finally, you can use a Script Task to load the child package from the SSIS object model, set values in the package, and launch it from there.
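The Execute Process route above boils down to building a DTEXEC command line per child package. A sketch that only assembles the command (package and config paths are hypothetical; /Conf is DTEXEC's configuration-file switch and /Conn overrides a named connection manager):

```python
import subprocess  # needed only if you actually launch the command

def build_dtexec_cmd(package_path, config_path, connect_string=None):
    """Assemble a DTEXEC invocation that forces a child package to load
    a given XML configuration file, optionally overriding a connection."""
    cmd = ["DTExec", "/FILE", package_path, "/Conf", config_path]
    if connect_string is not None:
        # /Conn takes "name_or_id;connection_string"; the name here is hypothetical.
        cmd += ["/Conn", "MyOleDbConnection;" + connect_string]
    return cmd

cmd = build_dtexec_cmd(r"E:\ETL\Child.dtsx", r"E:\ETL\child.dtsConfig")
# subprocess.run(cmd, check=True)  # only on a machine with DTEXEC installed
```

Because the child gets the config file directly on its own command line, this sidesteps the parent-package timing issue entirely.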


Package with File System Task doesn't work without sensitive data with user keys

This problem is a bit weird but I'm just wondering if anybody else experienced this.

I have a package that has file system tasks (copying dtsx files actually). Basically the package copies other packages to a pre-defined destination. Thing is, it only works if one of the packages it is configured to copy has some sort of sensitive data (e.g., a connectionstring with a password), otherwise it reports a success message on execution but doesn't actually do anything. I've checked the forcedexecutionresult and it is set to None for that matter.

Just wondering if anybody else experienced this problem and of course if there's a way to solve it.

Thanks.

|||

Could the security permissions be different on your packages that are failing to be moved from those that are succeeding?

|||

No, they're all set to EncryptSensitiveWithUserKey.

However, all of the packages involved have their sensitive information set to obtaining them from Indirect Configuration Files. The only way around that I found so far was to put in a dummy connection object, set up an XML file package configuration for that dummy, and deploy it that way.

I'm suspecting that the user key isn't generated at all when all the packages get their configs from Indirect Configuration Files. For some reason, DontSaveSensitive doesn't work either. My original package insists on the presence of a key. Might have something to do with file read/write/modify permissions that come with the key?

I still want to do away with the dummy though.

Friday, March 23, 2012

Package Validation Error

Hi,

I am using the Lookup transformation to do ETL from a flat file to a SQL Server fact table, doing key lookups against different dimension tables:

I am using two connections in the connection manager: one for the flat file and one for the SQL Server table.

While Executing the Package, I am getting the following error:

TITLE: Package Validation Error

Package Validation Error


ADDITIONAL INFORMATION:

Error at Data Flow Task [DTS.Pipeline]: The "runtime connection "OleDbConnection" (110)" in the connection manager collection, Connections, of "component "OLE DB Destination" (102)" does not have a value for the ID property. Verify that the ConnectionManagerID property of the runtime connection object has been set for the component.

Error at Data Flow Task [DTS.Pipeline]: component "OLE DB Destination" (102) failed validation and returned error code 0xC004801C.

Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

Error at Data Flow Task: There were errors during task validation.

(Microsoft.DataTransformationServices.VsIntegration)


BUTTONS:

OK

Can anyone help me ?

Thanks in Advance,

Sundar

I guess this might help you:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=108043&SiteID=1

|||Make sure that in the flat file source and the destination that you have actually chosen a connection manager.

You'll get this error if you don't choose a connection manager.|||

Thanks a lot Phil and Unni.

Once I configured the connection manager, the package worked like a charm!!

Cheers,

Sundar

Package Validation Error

I have a package which takes a text file and imports data into a SQL table. It returns the Package Validation Error:

Error at Data Flow Task [SQL Server Destination [1430]]: Failure inserting into the read-only column "ZipCode".

The column ZipCode in the sql table is a varchar(50) column. I do not understand why it thinks the column is a read only column; if someone could shed some light on this I would greatly appreciate it.

Thanks!

GN

Take a closer look at the table, perhaps it is a calculated column. What is the CREATE TABLE statement for that table?|||

It's not a calculated column. It is just a standard varchar column. The create table script generated by SQL is:

CREATE TABLE [dbo].[ZipToState](

[ZipCode] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,

[StateAbbr] [varchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,

[City] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,

[PreferredCityName] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,

[ID] [int] IDENTITY(1,1) NOT NULL

) ON [PRIMARY]

Actually I'm starting to think that it may actually be a permissions issue.

Thanks!

GN

|||

Okay I've confirmed that it is a permission error.

When I use my login, which has sysadmin privileges, for the database connection, there is no problem inserting into the table fields. When I use another login that does not have sysadmin privileges, the package errors out stating the field is read-only. I've given the login the bulkadmin server role but I get the same error. I've also given the login the explicit server permission Administer bulk operations but I keep getting the same error. Additionally, the login has insert, delete, and select permissions on the table. So does anyone know what permissions are needed for the login to be able to perform the bulk insert via the package?

Thanks!


Package Time Outs

Okay, we are running our Master package (and therefore all related child packages) through a .bat file. The .bat file is scripted using the following logic for an entire month of daily runs:

Code Snippet

DTExec /FILE E:\ETL\FinancialDataMart\Master.dtsx /DECRYPT masterpwd /SET \Package.Variables[ReportingDate].Value;"2/01/2007" > E:\ETL\ErrorLogs\Processing\etl_20070201log.txt
IF NOT %ERRORLEVEL%==0 GOTO ERROR%ERRORLEVEL%
mkdir E:\ETL\ErrorLogs\Archive\20070201
move E:\ETL\ErrorLogs\Processing\*.txt E:\ETL\ErrorLogs\Archive\20070201


DTExec /FILE E:\ETL\FinancialDataMart\Master.dtsx /DECRYPT masterpwd /SU /SET \Package.Variables[ReportingDate].Value;"2/02/2007" > E:\ETL\ErrorLogs\Processing\etl_20070202log.txt
IF NOT %ERRORLEVEL%==0 GOTO ERROR%ERRORLEVEL%
mkdir E:\ETL\ErrorLogs\Archive\20070202
move E:\ETL\ErrorLogs\Processing\*.txt E:\ETL\ErrorLogs\Archive\20070202

etc...
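Since the .bat file repeats the same four lines per day, it can also be generated rather than hand-edited. A sketch that emits a month of entries matching the snippet above (paths and the masterpwd literal are taken from that snippet; the /SU switch present on some of the original lines is omitted here):

```python
TEMPLATE = (
    'DTExec /FILE E:\\ETL\\FinancialDataMart\\Master.dtsx /DECRYPT masterpwd '
    '/SET \\Package.Variables[ReportingDate].Value;"{m}/{d:02d}/{y}" '
    '> E:\\ETL\\ErrorLogs\\Processing\\etl_{y}{m:02d}{d:02d}log.txt\n'
    'IF NOT %ERRORLEVEL%==0 GOTO ERROR%ERRORLEVEL%\n'
    'mkdir E:\\ETL\\ErrorLogs\\Archive\\{y}{m:02d}{d:02d}\n'
    'move E:\\ETL\\ErrorLogs\\Processing\\*.txt '
    'E:\\ETL\\ErrorLogs\\Archive\\{y}{m:02d}{d:02d}\n'
)

def month_of_runs(year, month, days):
    """Emit one four-line block per day, matching the hand-written .bat."""
    return "\n".join(
        TEMPLATE.format(y=year, m=month, d=day) for day in range(1, days + 1)
    )

script = month_of_runs(2007, 2, 28)
```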

Generally it takes about 40-45 minutes to run one day's worth of data. However, we have found unpredictable instances where the job will take 3 hours or even 6 hours and appear to hang.

The weirdness sets in when we kill the job and rerun it. In every instance of a rerun, the job executes in the normal 40-45 minute time frame. Obviously, we would like to institute some sort of logging, monitoring, and error handling, including, if need be, a method to time out a process and restart it.

I am reviewing the WMI (Windows Management Instrumentation) Task but I'm not entirely convinced that it's the right tool for the job.

Questions:

    1. Has anyone else experienced the type of processing behavior that I described?
    2. Has anyone been successful at using WMI or another process to monitor and time out packages? If so, are there sample packages or a good tutorial that maps it out?
    3. Unrelated to this issue, we also have instances of incomplete processing logs. The logs don't finish writing, and the weird part is that they all end at the same point. Does anyone have experience with incomplete job logs? For example:

    Code Snippet

    Progress: 2007-06-20 12:46:49.87
    Source: Update factFinancial Data Flow
    Cleanup: 11% complete

Thanks in advance!

Sounds like you're encountering some sort of deadlock. It's not clear to me whether or not you've enabled logging in all the packages; have you?

Have you installed SP2? We added some more logging in SP2 around calls to external databases.

You shouldn't need to kill the packages, but if you really, really want to go that way, you could use Execute Process tasks (calling DTExec) instead of the Execute Package tasks, and set the TimeOut and TerminateProcessAfterTimeout properties.

Package Stuck on "Pre-Execute phase is beginning."

Hi all,

I have a monster SQL query (SQL Server connection) that I'm trying to export to a flat file. My package appears to just stick at the "Pre-Execute phase is beginning" stage. (I left it running overnight, and it's still going... no error message; I think it's just hung.) If I include a "TOP 10" in my source query it completes without a problem. I'm wondering if it's an out-of-memory issue, but when I've run into this in the past there have been error messages generated. I've tried using both a DataReader and an OLE DB data source. Does anyone have any ideas of what's going on and how I might fix it?

Thanks!!!

Jess

|||

You should run the query in Management Studio and look at its explain plan... Chances are that it has nothing to do with SSIS. Perhaps you have locks in place that are blocking parts of this query such that it never completes?

You could always wrap the query into a view and use the view as the source to your package. That might help the pre-execute phase.

Also, try setting your DelayValidation property to TRUE for the connection manager that's being used for this query.|||

Thanks for your response. I finally got it working. Our off-shore team actually wrote the query for this project and it was fairly convoluted. I just have the pleasure of actually trying to RUN it.

Anyhow, there was no discernible blocking from sp_who2 'active'. While switching the source query over to a view (per your suggestion), I discovered that the source SQL inserted a number of records into a (non-indexed) table variable via a cursor. Placing this data into a permanent, indexed table seems to be the change that got the package working.

I've done some more testing, and whenever I use a table variable in the source query I seem to get stuck at "Pre-Execute." Is this by-design?

Thanks again for your input.

Jess

|||

Hi Jessica,

It is never by design to get stuck. :)

Pre-execute is the time when your queries get prepared and all the problems in them are likely to show up at that point. Looking at the SQL Profiler might help you figure out what exactly is causing issues while queries are processed.

HTH.

Package not picking up latest Config File changes

Hi,

I've created a solution with 5 packages in it. I've scripted a config file for each package so that I can change the source file connection and the database servers (source and destination). On the first package, I am able to change the config attributes and the effects are picked up, but the rest of the packages do not pick up the changes.

For example, if the source file is "C:\Files\source.txt" and this is changed to "C:\Files\source123.txt" (just to make sure it fails), it won't pick up the new changes and still uses "C:\Files\source.txt".

Also, I tried changing the name of the config file itself and the package still ran as if nothing had changed. It's as if it's not recognising the config file.

Any ideas ?

Thanks, Richie.

Can you check whether your file connection has any property expression that changes the ConnectionString back to "C:\Files\source.txt"?

Thanks,
Ovidiu Burlacu

|||

Hi,

There aren't any expressions for the file connections. The way I have the package set up is like this:

STAGE_PACKAGE - Takes the data from a flat file and inserts it into a staging table.

TRANS_PACKAGE - Takes the data from the staging table and loads it into the database.

PARENT_PACKAGE - First calls the STAGE_PACKAGE then the TRANS_PACKAGE. An XML config file has been created to capture the location of the file to pick up and the server name to connect to. These are then passed to the STAGE and TRANS packages via parent variables.

I've tried deleting the PARENT_PACKAGE and recreating it, but exactly the same happens.

Richie.

|||I ran into this problem with Parent Package Variables, and switched to Environment Variables instead. You might want to give this a try.|||

This is a known problem with parent package variables. Try to use other avenues, like environment variables or registry settings, if possible.

Thanks,
Ovidiu Burlacu

|||

That issue was documented a while ago; so if somebody is affected by this, go there and vote:

https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=126146

Rafael Salas

|||

Thanks for the replies.

As a workaround, I decided to do away with the parent variables and set up a config file for the children. I thought that when the parent package was executed, it would call the first child, which would then read the config file. But it didn't.

When I run the child package directly, it picks up the config file, but when I run it from a parent package it doesn't. Is this the same issue as with the parent variables?

Thanks,

Richie.

|||Anyone ?|||

I am not sure how your packages are set up, but if any of the configurations in the child packages rely on a value received from the parent package, that simply will not work because of the issue I mentioned in my previous post.

How are you running the packages? BIDS, SQL AGENT, DTEXEC?

Take a look at the execution progress to see if there is any warning about configurations not taking place.

Rafael Salas


Package loses configuration files!

I created a project with multiple packages. I created a global config file that all packages reference. I also created a config file for each package that contains the package specific settings. I checked it all into TFS. I went to a different machine and checked it all out. Upon loading a package into the IDE I got the following 3 warnings.

Warning loading MT_LSE_PROD_StageLoad.dtsx: The configuration file "MT_LSE_PROD_StageLoad.dtsConfig" cannot be found. Check the directory and file name. e:\contentloader\sprint1a\MT_LSE_PROD_StageLoad.dtsx

Warning loading MT_LSE_PROD_StageLoad.dtsx: The configuration file "Environment.dtsConfig" cannot be found. Check the directory and file name. e:\contentloader\sprint1a\MT_LSE_PROD_StageLoad.dtsx

Warning loading MT_LSE_PROD_StageLoad.dtsx: Failed to load at least one of the configuration entries for the package. Check configurations entries and previous warnings to see descriptions of which configuration failed. e:\contentloader\sprint1a\MT_LSE_PROD_StageLoad.dtsx

Any ideas? And yes, the files *ARE* there! I tried building the project and it builds. I also tried doing a deployment build and that too built. I am so confused!!!

Are you using SQL 2005/Express SP1? As far as I know, there used to be a known issue with SP1: if you use a configuration file to deploy an SSIS package, the referenced file names in the configuration file will be changed to lowercase. Since XML is case-sensitive, the files can't be found if the name does not exactly match. Not sure whether there is any hotfix available now.

|||

We are using SQL 2005 Enterprise. I believe it is SP1 as well. Will see if naming everything in lowercase only fixes it up. Thanks.

Wednesday, March 21, 2012

Package hangs on parallel "upserts"

I have data flow tasks, one which validates the import file and one which processes the import file if the validation passed. The validation runs and pushes the three row types to three different recordset destinations. When I enter the processing data flow task, I have three parallel trees processing each recordset saved in the previous task. I'm using a script component to generate the rows which are then sorted and merged with the production database to find existing records. Based on this, I split to an OLE DB command (running an UPDATE command) or OLE DB destination (to simply insert the records.)

In this particular case, all records are being updated and nothing is being inserted new. Two of the three trees will complete the sort but hang on the merge, split, and OLE DB command components. The other will do the same but also hang on the split.

In another case, I truncated each destination table before running the package and the package runs fine.

Are toes being stepped on in the data flow task and causing a deadlock?

Update: I removed the sort transformation and sorted the rows before pushing them to the recordsets and I still get the same results.

If you were getting a deadlock then I think SQL Server would capture this and throw an error. That's not what's happening here. It's more likely that blocking is occurring. Use sp_who2 and sp_lock to determine if blocking is occurring.

-Jamie

|||

A common issue when you're modifying the same table via multiple paths in your data flow is a deadlock on the destination table. To resolve:

1) choose "Fast load" and "Don't lock table" in the OLEDB Destination

2) ensure that you don't have a race condition between the paths, i.e. that updates which depend on inserts _in the same run_ don't execute before those inserts have reached the DB.

Cheers/Kristian

|||I checked the sp_who2 stored procedure and nothing was in the BlkBy column.
Fast load is being used and I chose "don't lock table" but that didn't fix the issue.
The updates don't depend on the inserts. I merge the input with production on keys and split on NULL values in production.

I ended up splitting the three trees into three data flow tasks to see if that would fix the issue, but I get the same results; only this time I noticed that not all rows have left the merge. The merge is a left outer join on keys, and in this case all input records are already in the database. So I'm joining 62k input records with the same number of (identical) production records, and only 9,937 rows leave the merge. Also, when running the package, that task remains yellow.
|||

The solution is pretty simple: do the inserts and updates in separate data flows. You can use raw files to pass data between data flows.

-Jamie

package goes into loop while reading paradox file

I have an SSIS package which is supposed to read this .db file and insert it into a SQL 2005 table. But when I execute this package it goes into a loop, reading the file over and over again. The package fails after inserting some 10 million rows (the actual row count is no more than 270,000), giving the error message 'Not enough space on temporary disk'.

On examining the data transferred into SQL, there are duplicate rows.

I also used the Import/Export Wizard provided by SQL (thinking there might be some error in the package code) to transfer the data from the .db file to SQL, but it has the same result (goes into a loop).

I would appreciate any help in this problem. Let me know if you have any other questions.

Thanks

Do you have anything except the source and the destination in your package?

Thanks.

|||

If you can successfully extract 5-10 rows from a Paradox file, the same should work for 270,000 as well. Please also check whether the final rows are corrupt.

|||

Hello Bob,

The package which I have created has source, conversion and destination components. The conversion component converts an nvarchar field to varchar. This package works on another .db file, say 'YYY.db', but then fails to read file 'ZZZ.db'.

The file does not seem to be corrupted, because when I try to view it through a Paradox viewer it gives me the correct record count, field names etc. and also displays the data correctly. We also have another viewer through which the file attributes and data are correctly displayed. Therefore it seems that ZZZ.db is not corrupted. The package also reads the file, but then it fails to stop reading, as if it thinks there is no end of file. Yet the other, independent viewers are able to read the file.

My computer does have the latest updates. Any ideas/suggestions/thoughts would be greatly appreciated.

Thanks,

Smoky

package fails after OnPreValidate (but not in BIDS)

I have all the package logging tickboxes checked but in the log file I only get this:
OnPreValidate,<machinename>,<account>,<packagename>,{7741AD7F-1941-4F4C-AE9D-08068C8856E4},{F6924552-600A-450B-995F-C24AB5C49FC3},6/25/2007 5:21:12 PM,6/25/2007 5:21:12 PM,0,0x,(null)

Then nothing.

The package fails.

When I run it in BIDS I do not get an error.

Anyone know why?
|||
If you're not running the package in BIDS, how are you running it? Agent?
|||
it is running as an all powerful system user
|||
Right, but is DTEXEC being called from Agent, from a command prompt, or are you using a custom app to run the packages?
|||

think this was similar problem to the other issue I was having

we changed the security level to use password, deploy and it works fine now

shame the error couldn't have been more informative :(

|||

adolf garlic wrote:

think this was similar problem to the other issue I was having

we changed the security level to use password, deploy and it works fine now

shame the error couldn't have been more informative

Well, that's why John and I asked the question that you never answered.... We already knew what the problem was, but needed you to confirm and walk through the problem on your own.

Tuesday, March 20, 2012

PACKAGE ENCODING

Hi, a new (little..) problem with xml source.

I have to import large xml file in sql server and I use data transfer task and xml source.

The xml files are generated without specifying any encoding, so I get many character errors if I don't change the encoding.

When I put this encoding in the xml file "by hand" (with an xml editor): <?xml version="1.0" encoding="ISO8859-1"?>, the SSIS task works perfectly without any error.

So, I'm looking for a way to use this encoding without editing the xml file (it's more than 500 MB...). The ways I can imagine are:

1) change the package encoding (but I haven't found this kind of setting)

2) change the xml source encoding (but I haven't found this kind of setting)

3) change the console code page with chcp (normally I have 850; I have tested 1252, but without any success)

4) make an xml transformation (but I don't know the best way); I've tried with the XML task without any success...
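A fifth option, not mentioned in the thread, would be to prepend the declaration in a streaming pre-processing step outside of SSIS, so the 500 MB file never has to be loaded into memory. A rough sketch in Python, assuming the file has no existing declaration (the paths and the use of the standard IANA name ISO-8859-1 are my assumptions):

```python
# Prepend an XML declaration to a large file by streaming it in chunks,
# so the whole document never has to fit in memory.
DECLARATION = b'<?xml version="1.0" encoding="ISO-8859-1"?>\n'

def prepend_declaration(src_path, dst_path, chunk_size=1024 * 1024):
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        dst.write(DECLARATION)            # new first line of the output
        while True:
            chunk = src.read(chunk_size)  # copy the rest unchanged
            if not chunk:
                break
            dst.write(chunk)
```

The resulting file can then be fed to the XML source as usual; in SSIS this copy step could live in an Execute Process task or a Script Task.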

could anybody help me?

thank you in advance

alessandro

Hi, an update.

I've found a little "workaround" using the XML task in patch mode. In this way I can patch the original xml file with a diff xml file that contains only the encoding settings. The output is a sort of "merge" of the xml file and the new encoding.

The real problem is that this package fails because it parses the xml file before creating the new one... and it finds the illegal characters (which are the very reason I want to change the xml encoding...!!!).

I've seen that it works by making a test (replacing the illegal characters "by hand").
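If the blocker really is a handful of illegal characters, one approach is to scrub them in a pre-processing step before the XML task ever parses the file. A hedged sketch in Python (the paths are made up) that drops the control characters XML 1.0 forbids, everything below 0x20 except tab, LF, and CR, streaming so a large file stays manageable:

```python
# Bytes below 0x20 other than tab (0x09), LF (0x0A) and CR (0x0D)
# are illegal in XML 1.0; strip them before the parser sees the file.
ILLEGAL = bytes(b for b in range(0x20) if b not in (0x09, 0x0A, 0x0D))

def strip_illegal(src_path, dst_path, chunk_size=1024 * 1024):
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            # translate(None, delete) removes every byte listed in ILLEGAL
            dst.write(chunk.translate(None, ILLEGAL))
```

Running this before the XML task (or before the patch step) would remove the characters that make the parse fail, mimicking the "by hand" replacement automatically.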

So I have found another way to arrive at the same problem....

Please...! Someone help me!

thank you in advance

Alessandro

|||

....Can nobody help me?

Now I'm working on a .net xml source with scripting...

lavez

Package deployment question

Hi,

I have a standalone SSIS package that I wish to deploy from the file system, as opposed to SQL Server deployment.

Anyways, I was wondering if there's a command line utility for running SSIS packages on an ad hoc basis? What I was thinking was that I would put the call to the command line in a stored procedure using xp_cmdshell, so that the package can be called that way.

Am I thinking about this correctly?

Thanks

dtexec.exe is what you'll need to run from the command line.|||

Thanks. Is this what people generally use for file system deployment?

Question:

Do I need to do anything special with the package first? Or just call it with dtexec.exe from wherever the pkg resides on the file system?

|||

I guess I am wondering about protection levels... and how to work with protection levels when using file system deployment?

I find this all very confusing!

Are there any good articles that discuss all the in's and out's of proper pkg deployment?

|||

Yes, dtexec is what is used to execute from the file system (and from the database). DTEXEC is the runtime engine for SSIS.

If you are running from the file system, I'd recommend setting the ProtectionLevel to EncryptSensitiveWithPassword, and using the /Decrypt switch to pass the password to the package.
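The /FILE and /DECRYPT switches are real dtexec options; the package path and password below are made-up examples. A small Python sketch that just composes the invocation (SSIS obviously has to be installed on the machine to actually run it):

```python
def build_dtexec_command(package_path, password):
    # /FILE loads the package from the file system; /DECRYPT supplies
    # the password the sensitive values were encrypted with.
    return ["dtexec", "/FILE", package_path, "/DECRYPT", password]

cmd = build_dtexec_command(r"D:\myPackage.dtsx", "password")
# On a box with SSIS installed you could then run it:
# import subprocess; subprocess.run(cmd, check=True)
```

Passing the command as a list rather than one string avoids quoting problems if the package path contains spaces.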

|||

Ok thanks.

Q. Concerning "sensitive" data, I'm not 100% sure what constitutes sensitive data. FTP conn mgr passwords, and OLE DB conn mgrs using SQL Server logins where you must specify the passwords?

Q. Another question: What if the OLE DB conn mgrs use Windows Authentication to connect to the db, and someone else logs on to the server to run the package (and they don't have write permissions to the database)? Will this cause the conn mgr to fail to connect to the db? In which case, even having a password won't help, right?

Q. But it seems that file system deployment forces the issue, so that I MUST set up a password if I am not deploying to SQL Server, correct?

Thanks much

|||

Hi,

I set the encryption to "EncryptSensitiveWithPassword", and attempted to run this from Management Studio:

EXEC master..xp_cmdshell 'DTExec /FILE "D:\myPackage.dtsx" /DECRYPT "password"'

It appears to run, but these are some of the errors I get. It looks like both that it doesn't like the key and that SSIS isn't properly installed on this server. Two separate issues, but how can I resolve the key problem first?

Thanks

output

NULL

(1 row(s) affected)

output

Microsoft (R) SQL Server Execute Package Utility

Version 9.00.1399.06 for 32-bit

Copyright (C) Microsoft Corp 1984-2005. All rights reserved.

NULL

Started: 7:56:53 PM

Error: 2007-06-18 19:56:53.89

Code: 0xC0016016

Source:

Description: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available.

End Error

Error: 2007-06-18 19:56:54.56

Code: 0xC0202009

Source: myPackage Connection manager "myServer.myDB.mySQLServerLogin"

Description: An OLE DB error has occurred. Error code: 0x80040E4D.

An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'mySQLServerLogin'.".

End Error

Error: 2007-06-18 19:56:54.58

Code: 0xC0012024

Source: Script Task

Description: The task "Script Task" cannot run on this edition of Integration Services. It requires a higher level edition.

End Error

Warning: 2007-06-18 19:56:54.58

Code: 0x80019002

Source: OnError

Description: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.

End Warning

Error: 2007-06-18 19:56:54.58

Code: 0xC00291EC

|||It seems as though the passwords do not match. Case matters, so ensure they match exactly. You did specify a password in the package properties box, right?|||You also have a permission issue connecting to the SQL Server DB. Make sure that whatever user is running the package has rights to connect to the SQL Server|||

Yes, the password is set on the package. It's just "password", so it's a no-brainer.

So, Rafael, you're saying that the /Decrypt switch is not related to the rights to connect to the server?

I thought that having a password on the package would take care of all the access issues. That is, all the connections are already set in the package, so whoever runs it doesn't need to worry about the underlying connections?

So, how do I resolve this?

Basically, I am using a SQL Server user with read/write permissions in the database for all my OLE DB conn mgrs. Why does this connection fail when I use DTExec? What piece am I missing here?

Need help! Thanks.

|||Have you just tried running this on your own with dtexec on the command line? Without using xp_cmdshell?|||

I think these are 2 separate issues. The package password grants the rights to read and execute whatever is inside the package. I made my comment based on the other error I see in the log:

An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'mySQLServerLogin'.".

It looks like the login credentials to connect to that SQL Server are not right.

|||

Rafael Salas wrote:

I think these are 2 separate issues. The package password grants the rights to read and execute whatever is inside the package. I made my comment based on the other error I see in the log:

An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'mySQLServerLogin'.".

It looks like the login credentials to connect to that SQL Server are not right.

But that would be because the sensitive information couldn't be decrypted, no?|||Sadie,
Try using EncryptAllWithPassword instead.|||

The package is set up with OLE DB conn mgrs that use SQL Server users.

When I run DTExec.exe, I am doing so as my Windows Login, which has admin rights on the server. I've never had permissions problems with the database.

Package Configurations - Using One config file for Multiple Packages

Hi All,

I am currently working on a project where we have many SSIS packages and want to minimize the number of config files used. What I was thinking was using one master config file with the Server/Database info, which would be used by all the packages. The thing is, there are multiple packages being executed from within a master package, and I was wondering if this will be an issue. Is there anything else I should keep in mind? Any help is appreciated.

Thanks

If my memory serves correctly, you will get lots of warnings about references in your config file to properties that don't exist in your package, but it should still execute OK.

-Jamie

|||Thanks Jamie