Friday, March 30, 2012
PAE SQL Server questions...
I'm wondering what command I could issue to see the amount of memory a named instance is using, from within the instance itself. We've enabled Physical Address Extension (PAE), and Task Manager no longer shows the correct amount of memory being used by the process. Under perfmon, I've added the MSSQL$INSTANCENAME:Memory Manager object and I'm looking at Total Server Memory (KB). I'm seeing 4 GB when max server memory is capped at 2 GB. Does this show the whole server's memory and not the instance's? Is there some way to see the instance memory?
Cheers,
-Kilka
|||DBCC MEMORYSTATUS. It'll give you enough to start with. Coupled with MEMUSAGE, you can pinpoint the exact object that hogs the memory.
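For example (both commands report on whichever instance your query window is connected to, so run them inside the named instance; DBCC MEMUSAGE is undocumented, so treat this as a rough sketch, not gospel):

-- Documented: a breakdown of this instance's memory usage.
DBCC MEMORYSTATUS

-- Undocumented: reports the objects occupying the most buffer cache.
-- As noted below, test it off production first.
DBCC MEMUSAGE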
|||Thanks rdjabarov,
I'll have to check this out in our testing environments. I've read that MEMUSAGE can cause the instance to crash, so I'll do some more testing when I get back to work to verify that it'll be safe to use on live.
Cheers,
-Kilka
Monday, March 26, 2012
Package xml config files used in child packages?
If you are using the Execute Package task to execute child packages, it looks like you've bumped into a current SSIS limitation. Take a look at the following posting:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2078371&SiteID=1
Regards,
Yitzhak
|||Seems like that problem has to do with setting variables at the command line in child packages, though?
|||I'm just curious: if I use a config file that specifies the database connection to use, what's the best way to pass that connection on to the child packages?
You can pass information through parent package variable (PPV) configurations. However, PPVs are applied after all other configurations, so this can cause some problems. For example, if you are trying to pass in the connection string for the database that your SQL Server configurations live in, the PPV isn't applied until after the SQL Server configurations have already been processed.
Another way to accomplish this is to use an Execute Process task instead of the Execute Package task. Then you can call DTEXEC and pass along whatever connection strings you want to use.
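For example, a minimal sketch of such a DTEXEC call (the package path, the "DestDB" connection manager name, and the connection string are placeholders, not values from this thread):

dtexec /File "C:\Packages\Child.dtsx" /Conn "DestDB";"Data Source=MYSERVER;Initial Catalog=MyDB;Provider=SQLNCLI.1;Integrated Security=SSPI;"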
Finally, you can use a Script Task to load the child package from the SSIS object model, set values in the package, and launch it from that.
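A rough sketch of that Script Task approach in VB.NET (the scripting language SSIS 2005 uses); it assumes a reference to Microsoft.SqlServer.ManagedDTS has been added, and the package path and "DestDB" connection manager name are placeholders:

' Load the child package, override its connection, and run it.
Dim app As New Microsoft.SqlServer.Dts.Runtime.Application()
Dim child As Microsoft.SqlServer.Dts.Runtime.Package = _
    app.LoadPackage("C:\Packages\Child.dtsx", Nothing)
' Point the child's connection manager at the database we want.
child.Connections("DestDB").ConnectionString = _
    "Data Source=MYSERVER;Initial Catalog=MyDB;" & _
    "Provider=SQLNCLI.1;Integrated Security=SSPI;"
' Execute returns Success or Failure for the child run.
Dim result As Microsoft.SqlServer.Dts.Runtime.DTSExecResult = child.Execute()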
Friday, March 23, 2012
Package Update and Build Process
My SSIS solution has about a hundred packages, and from time to time I have to edit a package. I understand I could use the 'Build' command to compile only the updated package, as opposed to 'Rebuild', which recompiles all of the packages.
Nevertheless, in both cases SSIS opens all of the packages in the design environment before compilation. My packages are stored in SourceSafe, so that process takes quite a long time, and I was wondering whether there is any way to compile only the updated package without any of the other packages being opened during the Build/Rebuild process. For example, we can use dtutil to deploy only updated packages without running the Package Installation Wizard.
Turn off the "CreateDeploymentUtility" option, as described in http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=874332&SiteID=1. With this option disabled, the packages will stop being opened every time you debug just one package via F5 or select the Build Project or Build Solution menu items.
For that matter, turn off the Integration Services project's "Build" option in Visual Studio's Configuration Manager. SSIS in BIDS doesn't compile or build anything; rather, it copies your hundred .dtsx files to the project-relative "bin\" subdirectory. It's doubtful you need four copies of each of the hundred packages: one in source control and three in your local workspace, two of which are superfluous (namely the copies in bin\ and bin\Deployment).
As you mentioned, use dtutil, or even xcopy (if appropriate), for deployment rather than the Package Installation Wizard. For example, see http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1828408&SiteID=1, wherein dtutil is used for SQL Server deployment.
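For example, a sketch of deploying one updated package to a server's msdb package store (the file path, package name, and server name are placeholders):

dtutil /File "C:\Packages\LoadSales.dtsx" /Copy SQL;LoadSales /DestServer MYSERVER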
|||Thanks very much; your suggested approach will save me the painful waiting time I had to endure before.
Asaf
Package randomly stops
I have a very weird issue in my latest package. I run it, and it just randomly stops. Whether I watch it in debug mode or run it outside of debugging, a command prompt window will flash for an instant (too fast to read) and then the package will stop. It stops inside a Foreach Loop, and the loop's "on completion" is never fired. I never receive any errors; it's just like someone hit the stop button. Sometimes it will process hundreds of files before stopping, other times only a few. (And it's not stopping on the same file each time... it doesn't appear to be related to the file at all.)
Any ideas what could be going on? How to troubleshoot?
The process might have crashed (the console window is SqlDumper collecting the crash information). Check whether C:\Program Files\Microsoft SQL Server\90\Shared\ErrorDumps\ contains any dumps. Also see this thread:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=541734&SiteID=1
Make sure you install SP1, and if the problem still occurs, contact Product Support Services.|||
Yes, that folder contains files from the times it stopped. What can I read them with?
That thread mentions turning on error reporting. Do you do that on the server or on my dev workstation? (And where do I find that setting?)
Another problem I have (maybe related?) is that about once a day Visual Studio will just "disappear" while I am working on a package (or sometimes even when I am not working on it); however, there are no dump files for those occurrences.
Yeah, I've been using SP1 for quite some time.
Thanks
Chris
Chris Honcoop wrote:
Yes, that folder contains files from the times it stopped. What can I read them with?
That thread mentions turning on error reporting. Do you do that on the server or on my dev workstation? (And where do I find that setting?)
The files are minidumps that can be read by debuggers, like Visual Studio. I would not recommend trying to read them, though - they are meant for product developers, not for users (unless you've built a custom task or transform and suspect it might be involved in the crash).
Use the SQL Server Error and Usage Reporting tool to enable the "Send error reports..." option - this will allow the development team and product support to analyze the problem and likely provide a fix in the next SP. If you need a solution sooner, contact Product Support Services.
|||The only custom object I have installed is the RegEx one available from Microsoft's download site; however, I am not using it in this package (as it wouldn't work unless I was running the package from my workstation). I did uninstall it, but it's still in my list of available objects if I go into "Choose Items..."|||Well, I ran the package directly on the server and it crashed as well, so it doesn't appear to be something specific to my machine.
An overview of the data flow that fails: I have a flat file with data in the header that I need to keep and tack onto each row (see my thread on cross join), as well as one set of metadata that determines which rows in the file I need to keep. So I take the file and split it four ways; one of the splits is basically all of the data except the metadata. I join one set of metadata to the main data to retrieve the rows of data I need. Then I also add a column to each data stream, hardcode it to 1, and join all the metadata back to the main data stream (since there is no cross join function).
The funny thing is, I have been watching what it looks like when it crashes: the file source and the split always work, but the sorts before the merges, or the merges themselves, are where it seems to die (if indeed the graphical view at the time of the crash is correct).
How do I know if it is sending the crash info to Microsoft? It has crashed 5 times since I turned that on, and only once did it bring up a box saying Visual Studio encountered an error.
|||I've tried some more things as well... still random crashes. I've remade the package from scratch and restructured the data flows 4 or 5 different ways, and I still get the random "disappearing act" and a memory dump to that location.
|||Please contact Product Support Services - they'll be able to analyze the crash dumps and provide a solution for the problem, or a temporary workaround until the full solution is developed. It is hardly possible to analyze crashes via a forum.
Wednesday, March 21, 2012
Package hanging on OLE DB Command with large result sets
I have an SSIS package (SQL 2005 SP2 and Visual Studio SP1) that does the following:
OLE DB Source --> Conditional Split --> OLE DB Command #1 --> OLE DB Command #2
The source reads from database A. Each row is variable-width and up to several KB wide, including two ntext columns.
Command #1 executes a stored proc in db A, using a bunch of inputs and two output parameters.
Cmd #2 executes an update in db B, using the two output params from cmd #1 as inputs.
When the rowset size is small, around 500, everything works fine.
However, when the rowset size is larger, around 5000, SSIS hangs when trying to execute cmd #2. The profiler shows that none of the cmd #2 updates are ever executed. No error messages are produced, and the connection never times out -- it just hangs forever.
If I replace the cmd #2 updates with a simple select, everything works fine. If I replace it with a stored proc that does an update, it hangs.
The workaround I came up with was to create a new table in db B and do inserts into that table, but unless I'm missing something, this still seems like a bug...
Do you have the RetainSameConnection property set to True on the OLE DB connection manager you are using? Try that if not. (RetainSameConnection makes everything that uses the connection manager share a single physical connection, which can help when one connection in the package ends up blocked by locks held on another.)
Wednesday, March 7, 2012
Overwriting the backup media
Is there a command to overwrite the backup media? (I tried the NOSKIP option, but that doesn't serve the purpose.) I want the command equivalent of the checkbox in the backup window that says "Overwrite existing media".
Thanks.
|||Don't know in which language, but if you're using SQL-DMO,
in VB it's like this...
Dim oBackup As New SQLDMO.Backup
oBackup.Initialize = True
(Setting Initialize to True will put the backup device into 'overwrite' mode.)
|||To overwrite using the BACKUP command, you need to specify SKIP, INIT. If you want to obliterate the backup AND any other backup sets written across the tape (a striped set), then use FORMAT. Let us know if you are still having problems.
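For example, a minimal T-SQL sketch (the database name and backup paths are placeholders):

-- INIT overwrites the backup sets on the media instead of appending;
-- SKIP bypasses the expiration and name checks that would block it.
BACKUP DATABASE MyDatabase
    TO DISK = 'C:\Backups\MyDatabase.bak'
    WITH INIT, SKIP

-- FORMAT rewrites the media header, destroying every backup set on the
-- media (including the other members of a striped set):
-- BACKUP DATABASE MyDatabase TO TAPE = '\\.\Tape0' WITH FORMAT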
|||Hi Paul,
That seems to work.
Thanks a lot.