Showing posts with label import. Show all posts

Monday, March 26, 2012

Package won't run correctly when called

I have a situation where I need to import data from Excel, and of course I'm using DTS to do so. Because the spreadsheets have a variety of datatypes in them, I have to make sure that DTS recognizes certain columns as INT so they will import properly. To do so, I add 5 rows of 1's to the top lines of the individual sheets in the correct columns. After the 1's are entered, I import to staging tables and then into the production tables. The package itself is run from a job, and the job is called by a stored procedure running sp_start_job. Here is a sample of the ActiveX script I am using to alter the spreadsheets:

*****************************************************************
Function Main()

    On Error Resume Next

    Dim Excel_Application
    Dim Excel_WorkBook
    Dim Excel_WorkSheet
    Dim i, x

    Dim sFilename
    Dim sSheetName

    sFilename = "\\Share\LCS Database Information - Data Test.xls"
    sSheetName = "Experience Spreadsheet"

    Set Excel_Application = CreateObject("Excel.Application")
    Excel_Application.DisplayAlerts = False

    ' Open the workbook specified
    Set Excel_WorkBook = Excel_Application.Workbooks.Open(sFilename)
    Excel_WorkBook.Worksheets(sSheetName).Activate

    ' Seed 5 rows of 1's into columns B through F
    FOR x = 2 TO 6 STEP 1
        FOR i = 1 TO 5 STEP 1
            Excel_WorkBook.Worksheets(sSheetName).Cells(i, x).Value = 1
        NEXT
    NEXT

    ' Save and clean up so Excel doesn't stay resident
    Excel_WorkBook.Close True
    Excel_Application.Quit
    Set Excel_WorkBook = Nothing
    Set Excel_Application = Nothing

    Main = DTSTaskExecResult_Success
End Function
*************************************************************************

It does this to 3 sheets and then the import steps run.
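For what it's worth, the seeding loop above boils down to writing a constant 1 into a 5-row block of columns B through F. The same logic, sketched in Python against a plain dict standing in for the worksheet (illustration only; the dict and function name are mine, not part of the package):

```python
# Sketch of the cell-seeding loop from the ActiveX script above.
# Cells(row, col) in the VBScript maps to grid[(row, col)] here.

def seed_ones(grid, first_col=2, last_col=6, rows=5):
    """Write the value 1 into `rows` rows of columns first_col..last_col,
    mirroring the FOR x = 2 TO 6 / FOR i = 1 TO 5 loops."""
    for col in range(first_col, last_col + 1):
        for row in range(1, rows + 1):
            grid[(row, col)] = 1
    return grid

sheet = seed_ones({})
# sheet now holds 1 in rows 1-5 of columns 2-6 (B-F): 25 cells total
```

The point of the 1's is that the Excel driver samples the top rows of each column to infer its datatype, so forcing numeric values there makes it treat the column as INT.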

Now, when I execute the package in Enterprise Manager it works exactly as I want. However, when I run it through the stored procedure, the ActiveX step doesn't run properly. My stored proc looks essentially like this:

****************************************************************************
--Create TEMP tables for the DTS package to store values from the spreadsheet
EXEC pCreateTEMPTables

IF @@ERROR <> 0
BEGIN
    SET @Segment = 1
    GOTO sqlerror
END

--Run the DTS package that will pull data into temp tables so the individual
--data points can be used to look up keys
EXEC msdb.dbo.sp_start_job
    @job_name = 'LCSRunDTS'

EXEC more.procs
****************************************************************************

One thing about the spreadsheets is that they have links in them, so warning messages get thrown up during the ActiveX script about updating the links. I tried to turn them off, but one message, "One or more links can't be updated", just won't go away; it comes up whether you click "Update" on the previous warning or suppress warnings.

Any ideas? Thanks
Well, I got the first problem above figured out. When I ran the package directly from Enterprise Manager, the ActiveX object was created on my local system and used Excel there to do what it needed to. When I tried to run the package on the server which didn't have Excel installed, my script failed. My new solution (as soon as it works) is to use WshRemote to run the script to alter the spreadsheets using a computer with Excel installed. The following script works when run from my PC but fails with the message "Error Source = Microsoft VBScript runtime error, Error Description: Permission Denied" when run as part of the DTS package called from a stored procedure via dtsrun:
--
Dim Controller, RemoteScript
Set Controller = CreateObject("WSHController")
Set RemoteScript = Controller.CreateScript("\\Server\Share\ExcelAdd1s.vbs", "MyLocalPC")
RemoteScript.Execute

Being new to the DBA world, I need some help with permissions. Is it that the SQL Server account doesn't have access to the shared drive, or to my PC? Where can I look to see what user is running the DTS package? Are there any questions that I'm not asking that I should be? Any help would be appreciated.

|||

When you run the package from EM directly, your logged-in account needs to have permission to the share. When it runs from an Agent job, the SQL Agent account must have permission to do the same thing, and it must be a UNC path, not a mapped drive.

Package Won't Open

I have just created a package to import data from excel files, and perform some simple sql statements. I saved the package, and when I try to re-open it, I get the following error:

"Error Source: Microsoft OLE DB Provider for SQL Server
Error Description: [DBNETLIB][ConnectionOpen (Connect()).] SQL Server does not exist or access denied."

And after I hit the "OK" button, I get the next error msg:

"The selected package cannot be opened. The DTS Designer has been closed"

Is my package completely lost? Can I open it elsewhere, or by changing some properties somewhere? Also, is there something I can do to avoid this in the future?! I hope all my work isn't lost ...

Any help would be greatly appreciated!

Brian

|||

Close Enterprise Manager, reopen it, and see if you can read the DTS package. You can still ping the server, can't you?

|||

Thanks for the quick response.

I closed and re-opened Enterprise Manager, to no avail. I'm working with my localhost and there is no problem pinging. Everything else seems fine; I just can't open this package. Any other thoughts?

|||

It sounds like it got corrupted somehow. I've never actually had this happen, and I've been doing DTS packages a long time. If you had saved it previously, right-click on the package and look at versions. See if you can open any of the previous versions.

|||

I have no idea what's going on, craziness I say. I only had one version when I checked. I've gone through creating these packages all different ways. They save no problem, but I can't re-open them. Any other thoughts? Thanks!

Wednesday, March 21, 2012

Package hangs on parallel "upserts"

I have data flow tasks, one which validates the import file and one which processes the import file if the validation passed. The validation runs and pushes the three row types to three different recordset destinations. When I enter the processing data flow task, I have three parallel trees processing each recordset saved in the previous task. I'm using a script component to generate the rows which are then sorted and merged with the production database to find existing records. Based on this, I split to an OLE DB command (running an UPDATE command) or OLE DB destination (to simply insert the records.)

In this particular case, all records are being updated and nothing is being inserted new. Two of the three trees will complete the sort but hang on the merge, split, and OLE DB command components. The other will do the same but also hang on the split.

In another case, I truncated each destination table before running the package and the package runs fine.

Are toes being stepped on in the data flow task and causing a deadlock?

Update: I removed the sort transformation and sorted the rows before pushing them to the recordsets and I still get the same results.

If you were getting a deadlock, then I think SQL Server would capture this and throw an error. That's not what's happening here. It's more likely that blocking is occurring; use sp_who2 and sp_lock to check.

-Jamie

|||

A common issue when you're modifying the same table via multiple paths in your data flow is a deadlock on the destination table. To resolve:

1) choose "Fast load" and "Don't lock table" in the OLEDB Destination

2) ensure that you don't have a race condition between the paths, i.e. if the updates depend on inserts from the same run, make sure those inserts have reached the DB before the updates execute.

Cheers/Kristian

|||

I ran sp_who2 and nothing was in the BlkBy column.
Fast load is being used and I chose "don't lock table" but that didn't fix the issue.
The updates don't depend on the inserts. I merge the input with production on keys and split on NULL values in production.

I ended up splitting the three trees into three data flow tasks to see if that would fix the issue, but I get the same results, only this time I noticed that not all rows have left the merge. The merge is a left outer join on keys, and in this case all input records are already in the database. So I'm joining 62k input records with the same number of (identical) production records, and only 9,937 rows leave the merge. Also, when running the package, that task remains yellow.
|||

The solution is pretty simple. Do the inserts and updates in separate data flows. You can use raw files to pass data between data flows.

-Jamie

Tuesday, March 20, 2012

PACKAGE ENCODING

Hi, a new (little...) problem with an XML source.

I have to import large XML files into SQL Server, and I use a data flow task and an XML source.

The XML files are generated without specifying any encoding, so I get many character errors if I don't change the encoding.

When I put this encoding in the XML file "by hand" (with an XML editor), <?xml version="1.0" encoding="ISO-8859-1"?>, the SSIS task works perfectly without any error.

So, I'm looking for a way to use this encoding without editing the XML file (more than 500 MB...). The ways I can imagine are:

1) change the package encoding (but I haven't found this kind of setting)

2) change the XML source encoding (but I haven't found this kind of setting)

3) change the console code page with chcp (normally I have 850; I tested 1252 without any success)

4) do an XML transformation (but I don't know the best way); I've tried with the XML task without any success...
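Option 4 can also be done outside SSIS entirely: since the declaration is just the first bytes of the file, a small script can stream the 500 MB file to a new copy with the declaration prepended, without ever loading it into memory. A minimal Python sketch (an assumption on my part: the raw bytes really are Latin-1, matching the declaration being added; and any existing declaration fits in the first 256 bytes):

```python
# Prepend an XML encoding declaration to a large file by streaming it
# chunk by chunk, so a multi-hundred-MB file never sits in memory.

def add_xml_declaration(src_path, dst_path, chunk_size=1 << 20):
    declaration = b'<?xml version="1.0" encoding="ISO-8859-1"?>\n'
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        head = src.read(256)
        # Skip an existing declaration so we don't emit two of them.
        if head.startswith(b"<?xml"):
            end = head.index(b"?>") + 2
            head = head[end:].lstrip(b"\r\n")
        dst.write(declaration)
        dst.write(head)
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
```

The resulting copy can then be pointed at by the XML source instead of the original file.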

Could anybody help me?

thank you in advance

alessandro

Hi, an update.

I've found a little "workaround" using the XML task's Patch operation. This way I can patch the original XML file with a diff XML file that contains only the encoding settings. The output is a sort of "merge" of the XML file and the new encoding.

The real problem is that this package fails because it parses the XML file before creating the new one... and it finds the illegal characters (which are the reason I want to change the XML encoding in the first place...!!!).

I've seen that it works when I make a test (replacing the illegal characters "by hand").

So I have found another way to arrive at the same problem....

Please...! Someone help me!

thank you in advance

Alessandro

|||

... Nobody can help me?

Now I'm working on a .NET XML source with scripting...

lavez

Package crashes without error information

I am creating an SSIS package to import some data. The package contains a big For Each loop with 3 parallel nested For Each loops. The package crashes within the big loop without returning any error information. It starts up SQLDumper and generates a dump, but does not report any error to the UI (Visual Studio or dtexec). When I run only one For Each loop and disable the others, the package executes successfully. Does anyone have an idea what might be wrong? Or is the only option to submit a bug report?

Thanks
HANNES

If it's crashing the package and generating a dump then a bug report is the way to go. This is something we will want to catch and fix for sure.

Thanks

Donald

Monday, March 12, 2012

Package Configuration Error

Not sure if you guys have run across this error. I created a simple SSIS package to import from a csv file into SQL Server, and I made it configurable so I can point to any of my 4 destination SQL Servers.

In the error list pane, i'm getting this warning:

Warning loading Package.dtsx: Table "[dbo].[SSIS Configurations]" does not have any records for configuration. This occurs when configuring from a SQL Server table that has no records for the configuration.

The funny thing is, the table is populated with 2 properties, Catalog & ServerName, but at run time it's like the package ignores the SQL table when applying configuration properties.

Any help would be greatly appreciated.

-- J

Might it be looking at a table in another database than what you are expecting?

|||

What Phil said. Every time I've encountered this before, it's been because the connection string that pointed to the database with the configuration table wasn't correct.

Saturday, February 25, 2012

Overwrite flat file destination does not work

I have 3 packages that run consecutively in a main package. Each of these packages reads data from a flat file, imports it into the database, and performs some updates; I then use a script component to write the data to a flat file destination, with the Overwrite flag on the flat file destination set to False. However, the flat file is not being appended to. All 3 packages are set to write to the same flat file destination, and each package overwrites the content of the flat file created by the previous packages.

I am wondering why Overwrite = False does not work on the flat file destination. Is there something else that I need to set, or is this a defect? Any inputs will be much appreciated.

Thanks,

M.Shah
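As an aside, the semantics being described, where Overwrite = False means each writer appends rather than truncates, map directly onto plain file-open modes. A minimal Python sketch of the expected difference (not SSIS code, just an illustration; the function name is mine):

```python
# Illustration of overwrite-vs-append semantics: "w" truncates the
# file on open (overwrite), "a" appends to whatever is already there.

def write_rows(path, rows, overwrite):
    mode = "w" if overwrite else "a"
    with open(path, mode) as f:
        for row in rows:
            f.write(row + "\n")

# Three consecutive writers with overwrite=False should leave all rows
# in the file; with overwrite=True only the last writer's rows survive.
```

The symptom reported above, each package wiping the previous one's output, is what overwrite=True behavior looks like in this sketch.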


It works fine for me; I can append to an existing file from multiple packages. Double-check all your destinations for the setting. I am using SP1 + hotfix, but I don't remember this being an issue before or having been fixed.

|||

I have double-checked, and I have Overwrite = False for all three flat file destinations, and the flat file still gets overwritten. I have SP1 with no hotfix.

Any further inputs will be much appreciated.

Thanks,

M.Shah

|||

I got it to work. I'm not absolutely certain what was wrong.