Friday, March 30, 2012
Page Borders
pages. We would like to have a border around each page, including the page
header and page footer. We've set a body border, but that doesn't draw
horizontal lines at the top and bottom of each page.
We've tried putting our controls in a rectangle and expanding the rectangle to
encompass a page both vertically and horizontally, but the page header is
excluded from this border (plus we suspect we'll have additional problems if
our individual controls cause a page break). Also, though we have the
rectangle control's height set to 9 inches, it shrinks so that it ends
after the last control in the region.
Any suggestions for defining a border on each page, the first page excluded
(it's a cover page)?
We're using MS Reporting Services 2005.
Thanks,
Chris|||Not sure if this does exactly what you're looking for, but in addition
to the body border, have you tried putting horizontal lines in the page
header & page footer?
Eva Pierce Monsen|||Thanks to Eva for her informative input.
Hi Chris.
As Eva has mentioned, you can explicitly customize the SSRS report's header
and footer sections in the VS 2005's report designer. Just right click on
the report designer's empty area and you'll find the "Page Header" and
"Page Footer" option in the context menu, select either one to edit the
template.
#Adding a Page Header and Footer to a Report
http://msdn2.microsoft.com/en-us/library/ms159677.aspx
BTW, as for the body's border setting, based on my local test, it also
works (it displays both horizontal and vertical lines). I simply set the border
width and border style (solid), and the rendered report displays both
horizontal and vertical border lines for the report body. I think the
problem you're seeing is possibly specific to the report body's size.
Hope this helps.
Regards,
Steven Cheng
Microsoft Online Community Support
==================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)|||Steven,
The horizontal body borders appear when we first render the report. However,
when we export the report to PDF, the horizontal lines at the top & bottom
of the body do not appear. Also, the border does not encompass the header
and footer, so even if the horizontal lines did appear, it isn't quite what
we're looking for. The prototype I'm working off of is a MS Word document
that has a page border that does encompass the header & footer.
Any other suggestions?
Thanks,
Chris
|||Hi Chris,
Thanks for your response.
Yes, the body border solution has its natural limitation since it does not
include the Header and Footer sections. Have you tried Eva's suggestion on
customizing the UI through the Header and Footer template in the report
designer? As I've also mentioned in the previous message, you can right
click on the report designer's empty area and you'll find the "Page Header"
and "Page Footer" option in the context menu, select either one to edit the
template. That way, you can put a report item (such as a Line) into the header
or footer section to simulate a border line. What do you think of this approach?
Regards,
Steven Cheng
Microsoft MSDN Online Support Lead
==================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)|||Hey Chris,
How are you doing on this issue? Did the further suggestion help?
If there is still anything we can help, please feel free to post here.
Regards,
Steven Cheng
Microsoft MSDN Online Support Lead
==================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)
Wednesday, March 28, 2012
Package success, despite task failure
When I run my package, a task fails; however, the package then claims that it was successful. Why is this, and how can I trigger a failed package when one task fails?
Thanks in advance.
Look at your package execution properties. You will see the "FailPackageOnFailure" option. The package claiming that it is successful is odd. You might want to also look at the "MaximumErrorCount" property (again at the package level) to make sure you haven't set it higher than 1. Look at the "ForceExecutionResult" property also to make sure you don't have it set to "Success".
Hope this helps.
|||
Where do I find 'Maximum Error Count' and 'Force Execution Result'?
Packages, Jobs and Scheduling issues
I'm starting to have second thoughts about switching to SQL 2005.
After hours of trying, I finally managed to get SSIS packages to run as intended. Now, I need to create a schedule for the package. I create a job, owned by the Local System account, which SQL Server Agent also uses. I can run the package myself, but when I schedule it, it always fails. After much trial and error, I decided to give up on this manner of running SSIS packages. I decided to create the job as a CmdExec type job instead. When run manually, it works fine, but when I schedule it, it doesn't run. Now, it's not that it gives an error message, it just DOESN'T run; it appears as "not scheduled" in the Job Activity Monitor, even though the schedule is listed in the Schedules property. How then, can I schedule the command to run? Thanks for the info.
The reason is very simple: the Local System account does not have access to network resources, so anything you run through that account fails. xp_cmdshell is a better choice, but you have to create a proxy to an admin-level account to run the Agent. All you need is covered below; post again if you still need help. I know it works and can run for hours like clockwork.
http://support.microsoft.com/kb/912911
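As a rough sketch of what the KB article walks through (the credential, account, and password names below are hypothetical placeholders, not values from this thread), the credential-plus-proxy setup looks something like this in T-SQL:

```sql
-- Create a credential mapped to a domain account that has the needed
-- network access. Account and password here are placeholders.
USE master;
CREATE CREDENTIAL SSISRunCredential
    WITH IDENTITY = 'MYDOMAIN\ssis_runner',
    SECRET = 'StrongPasswordHere';

-- Create a SQL Agent proxy that uses the credential.
USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name = 'SSISRunProxy',
    @credential_name = 'SSISRunCredential';

-- Allow the proxy to run SSIS package job steps
-- (subsystem_id 11 corresponds to SSIS in SQL 2005).
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = 'SSISRunProxy',
    @subsystem_id = 11;
```

After this, the job step's "Run as" list should offer the proxy instead of the SQL Server Agent service account.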
|||I usually run the SQL Server Agent under a service account that has the privileges required to run any processing.
You may run into another issue where errors in Analysis Services commands are not reported when run via the SQL Server Agent. I am in the process of trying to find out whether MS is aware of this issue. It appears to be present in all versions of SQL 2005 up to and including SP2.
Packages run faster on BIDS and slower on SQL Server
I am surprised to see that in Business Intelligence Development Studio debugging mode, my packages take less time. When I run them through the Agent on the SQL Server where the data actually resides, they take around 70% longer.
The package is now very close to the data and the database engine itself; in BIDS it wasn't.
Does anybody know why this happens? Do I need to tune something?
Any input will sincerely be appreciated.|||When you're using SQL Agent, are you using the SSIS subsystem or the command subsystem (DTExec)? Do you have the SSIS service started?
Monday, March 26, 2012
Package works in BIDS, but not in SQL Server 2005
Code Snippet
Public Sub Main()
    ' Read the package variables (names as given in the original post)
    Dim variable1 As String = DirectCast(Dts.Variables("packagevariable1").Value, String)
    Dim variable2 As String = DirectCast(Dts.Variables("packagevariable2").Value, String)
    Dim variable3 As Integer = DirectCast(Dts.Variables("packagevariable3").Value, Integer)
    Dim variable4 As String = DirectCast(Dts.Variables("packagevariable4").Value, String)
    Dim filePath As String = DirectCast(Dts.Variables("filePath").Value, String)
    Dim variable5 As String = DirectCast(Dts.Variables("packagevariable5").Value, String)
    Dim results As Boolean
    Dim fileGenerator As IProviderInterface
    Dim intFactory As integrationServiceFactory = New ProviderIntegrationServiceFactory()
    ' Call into the external (non-SSIS) assembly
    fileGenerator = intFactory.GetProviderEnrollmentGenerator(variable2, variable5)
    results = fileGenerator.GenerateFile(variable3, variable1, filePath, variable2)
    ' Report the task outcome back to SSIS
    If results Then
        Dts.TaskResult = Dts.Results.Success
    Else
        Dts.TaskResult = Dts.Results.Failure
    End If
End Sub
Can you be more specific about what you mean by "run in SQL Server"? How are these packages being executed? What other types of activities are failing in your packages? Is it always related to external code?
My thinking based on what you've given so far is that you're trying to load and execute packages from inside a CLR stored procedure. When you do that, you're using SQL's hosted CLR and are subject to SQL's security constraints, such as the Host Protection Attributes.
SQL CLR will deny execution of framework code that has these attributes.
ExternalProcessMgmt
ExternalThreading
MayLeakOnAbort
SecurityInfrastructure
SelfAffectingProcessMgmt
SelfAffectingThreading
SharedState
Synchronization
UI
|||Primarily, I'm trying to execute this in a SQL job. However, if I execute it from Integration Services, the result is the same. My errors seem to be related to the external code, but I have 15 packages that all use some external code. Other activities that are failing also include decrypting files, but again, these are run in scripts that use external code.
|||
Dan Wheeler wrote:
Primarily, I'm trying to execute this in a SQL job. However, if I execute it from Integration Services, the result is the same. My errors seem to be related to the external code, but I have 15 packages that all use some external code.
"if I execute it from Integration Services, the result is the same". I don't know if that means BIDS or something else, and if "the same" means success or failure. I'm mostly going off your subject that the package works in BIDS, but not from a SQL Agent job.
The difference between executing it from BIDS and from a job is usually just the security context. Under BIDS, it inherits your permissions and under SQL Agent, it is whatever account the agent runs under, which is rarely the same. Anything going on in that code that might be affected by those different permissions? Accessing remote or secured drives or files, perhaps?
There is also another effect of the different user context that I don't see any evidence of here, but it gets people all the time: by default, SSIS encrypts sensitive data using a user key. When the package is executed by another user (i.e. SQL Agent), this sensitive data (passwords, mostly) is not available. See threads and documentation for "ProtectionLevel" for more information if you think that might be a factor.
|||
JayH wrote:
"if I execute it from Integration Services, the result is the same". I don't know if that means BIDS or something else, and if "the same" means success or failure. I'm mostly going off your subject that the package works in BIDS, but not from a SQL Agent job."
It means that they both fail to execute the package.
The only thing remote in this situation are the databases. This package is not failing with database connectivity issues. My security context is a limited user. What permissions would SQL Agent need?
The packages are password encrypted. I provide the password when creating the SQL job. When I execute the package in Integration Services, I provide the password. Therefore, I don't think it is a protection level issue.
|||
Dan Wheeler wrote:
JayH wrote:
"if I execute it from Integration Services, the result is the same". I don't know if that means BIDS or something else, and if "the same" means success or failure. I'm mostly going off your subject that the package works in BIDS, but not from a SQL Agent job."
It means that they both fail to execute the package.
The only thing remote in this situation are the databases. This package is not failing with database connectivity issues. My security context is a limited user. What permissions would SQL Agent need?
The packages are password encrypted. I provide the password when creating the SQL job. When I execute the package in Integration Services, I provide the password. Therefore, I don't think it is a protection level issue.
I'm still confused about whether the packages run in BIDS or not. If you can call your external code from a simple test harness, then it should work from BIDS.
The error indicates that either GetProviderEnrollmentGenerator is returning Nothing, or that the error is being generated from inside GenerateFile. Is it possible that the variables you're passing into either one of those are not what you think they are? Have you put MessageBoxes in the script to display those values? Being totally agnostic about your code, I'd suggest writing those variables out to a file or something where you can get them verbatim and then calling your external code from a test harness with those values.
|||As the subject states, these packages all execute without error in BIDS. In the first post, I note that I can run this exact code from a console app. I've already created a test harness for this purpose. Since I'm trying to execute this in Integration Services, will the MsgBoxes still show up?|||
Dan Wheeler wrote:
As the subject states, these packages all execute without error in BIDS.
Quote "Dan Wheeler"
This code will even run from a command line console without error.
(as you can see, I've already created a test harness for this purpose) Since I'm trying to execute this in Integration Services, will the MsgBox's still show up?
Sorry, I think I'm confused about your use of "Integration Services". I couldn't tell if you meant BIDS or not. I also don't recall seeing that about your console app in the original post.
No, MessageBoxes will not show up if you execute from dtexec (i.e. the agent job). I was thinking about BIDS at that point.
|||Anyone else have anything to offer? I've created a deployment package and still no success.
|||
What if you try running a batch file from the agent job?
My experience mirrors Jay's on this: I think the error is security related. Does the account that SQL Server or the Agent is running under have access to all the files and folders used, not just by the package, but also by the .NET component?
Package won't run in a scheduled job
Hello,
I have put together a simple SSIS package that runs fine both via BIDS and via Mgt Studio.
The problem I have is that I can't get a scheduled job to run it. My scheduled job has only 1 step - and that step runs the package.
When I right click in Mgt Studio (SQL Agent) and choose "Start Job" I get this error: Executed as user: NMR044BRASQL886\SYSTEM. ....3033.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 14:10:56 Error: 2007-01-09 14:10:56.50 Code: 0xC0016016 Source: Description: Failed to decrypt protected XML node "DTS:Property" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. End Error Error: 2007-01-09 14:10:57.04 Code: 0xC0202009 Source: B2B_Sales_Territory_Send_Back Connection manager "b2b_datamart" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'.". End Error Error: 2007-01-09 14:10:57.04 Code: 0xC020801C... The package execution fa... The step failed.
Can anyone help please?
I'm new to SSIS so clear+simple answers appreciated.
Thanks in advance
Mgale1
Check the ProtectionLevel property of your package. My guess is that it is set to EncryptSensitiveWithUserKey. Is that correct?
This means that all passwords and such are encrypted with the username of the person that built the package hence when you run it as another user (which you are attempting to do here), it doesn't work. Read in BOL about the ProtectionLevel property to get a full understanding and then reply here with any problems.
-Jamie
|||
Hi Mgale1,
This could be due to the ProtectionLevel setting for the individual packages - that's my guess.
By default, these are set to EncryptSensitiveWithUserKey. This means as long as you personally execute the packages, your credentials are picked up and the packages execute in your security context. This is true even if you're connected to a remote machine, so long as you're using the same AD credentials you used when you built the packages. Does this make sense?
When the job you created executes, it runs under the SQL Agent Service logon credentials.
My understanding of the "Sensitive" in EncryptSensitiveWithUserKey is most likely flawed, but I cannot find a way to tell my SSIS package "hey, this isn't sensitive so don't encrypt it." Although this sometimes gets in the way I like this feature because it keeps me from doing something I would likely later regret. Anyway, my point is the Sensitive label is applied to connection strings and I cannot find a way to un-apply it (and I'm cool with that).
One of the first things an SSIS package tries to do (after validation) is load up configuration file data. This uses a connection, which (you guessed it) uses your first connection string. Since yours are encrypted with your own personal SID on the domain and this is different from the SID on the account running the SQL Agent Service, the job-executed packages cannot connect to the configuration files to decrypt them.
There are a couple proper long-term solutions but the one that makes the most sense is to use the EncryptSensitiveWithPassword Package ProtectionLevel option and supply a good strong password. You will need to supply the password when you set up the job step as well, and this should allow the package to run without needing your security credentials.
Note: You will also need this password to open the packages in BIDS (or Visual Studio) from now on... there's no free lunch.
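For example (the package path and password below are placeholders, not values from this thread), a package saved with EncryptSensitiveWithPassword can be run from the command line, or from T-SQL via xp_cmdshell, by supplying the password with dtexec's /De (decrypt) option:

```sql
-- Placeholder path and password; dtexec's /De option supplies the
-- EncryptSensitiveWithPassword password at run time.
-- (xp_cmdshell must be enabled on SQL 2005; it is off by default.)
EXEC master.dbo.xp_cmdshell
    'dtexec /F "C:\Packages\MyPackage.dtsx" /De "StrongPasswordHere"';
```

The same password goes into the job step when you schedule the package through SQL Agent.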
Hope this helps,
Andy
|||You may want to check this KB article:
An SSIS package does not run when you call the SSIS package from a SQL Server Agent job step
http://support.microsoft.com/kb/918760
|||Admin,
This topic is a good candidate for a "sticky" for the forum.
This question is asked once a day... we (the forum) could come up with a synopsis on package encryption.
|||I always use ProtectionLevel=DontSaveSensitive. Without exception.
This actually forces you to use configurations - and that is no bad thing in my opinion.
-Jamie
Package won't run from job
I imported my packages to SQL Server 2005.
Under Stored Packages > msdb, if I right click the package name and select "run package", the package runs fine without errors.
However if I put the same package in a job and run the job, the job fails every time and won't tell me why. It just says to check the history. This is what the history says:
Executed as user: myDomain\SQLService. The package execution failed. The step failed.
Seems like some kind of permission failure.
Any ideas, please. This permissions stuff is driving me nuts.
Here are all the permission levels that I am aware of when running a job:
1.) permissions at the package level (conn mgrs, ftp mgrs, etc)
2.) rely on SQL Server storage for roles and access control - what does that even mean?
3.) job owner (sa) - how does this differ from the job step owner?
4.) job step > run as [user] (SQL Server Agent Account) - why does this require a separate owner?
5.) package > server > Log on to the server [as user] - what the heck is this?
It's maddening. For example, I have no idea what #5 means - when it (the package) is logging on to the server, WHO is it logging on as if the package is running in a job? The job owner? The job step owner? Or something entirely different?
I've asked other people here if they have any clue how all these permissions interrelate, and frankly no one knows. It's a big mystery..... sort of like the big bang. Don't ask me how I passed my MCDBA. That is another great mystery of the universe.
It's no wonder I can't get this thing to run!
Thank you, error handler, for sending me task-related error messages.
So here's the problem:
The script threw an exception: Could not find a part of the path 'o:\myFolder'.
In one of my packages, I have a script that gets file names from the network drive, o:
My other packages reference the d: drive with no problem. It's the o: drive that's having problems.
So maybe whoever is running the package doesn't have permissions to the o: drive, right?
So, getting back to my first post, who needs the permission to read the o: drive? Probably the step owner?
|||
See this classic KB that explains it:
http://support.microsoft.com/default.aspx/kb/918760
The access to o: should be granted to the user who runs the package - the Agent service account, or the Proxy account if you use one for this job. In your case this is currently myDomain\SQLService (from the "Executed as user..." message). Also, if o: is a network drive, switch to using a UNC path (\\server\path), as drive mapping is user-specific.
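A quick diagnostic (the UNC path below is a placeholder): run a directory listing through xp_cmdshell. It executes under the SQL Server service account (or its configured proxy) rather than your interactive login, so it tends to fail the same way a job does when a share isn't reachable from a service context:

```sql
-- Placeholder UNC path; replace with the actual share.
-- Mapped drives like o: belong to an interactive session and
-- won't exist here, which is why the UNC form is needed.
EXEC master.dbo.xp_cmdshell 'dir \\myserver\myshare';
```

If this returns "Access is denied" or "path not found", the service account needs permissions on the share before the scheduled package can use it.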
|||So it seems the problem has to do with the fact that the SQLService domain account does not have permissions to access o:, or that I need to use a UNC path.
However, I still have some questions about how all the pkg/job permissions interrelate to each other:
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
2.) rely on SQL Server storage for roles and access control - what does mean? does this relate to #1 above?
3.) job owner (sa) - how does this differ from the job step owner?
5.) New Job Step > General tab > SQL Server Integration Services Package > Server: Log on to the server [Windows Authentication] - will this be the same login as the job step owner, that is, the SQLServer account?
If anyone could answer any or all of these questions, and #5 especially, I would really appreciate it!
Thanks much
|||
sadie519590 wrote:
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
It depends - if your connection managers are using Windows Authentication, then the account SQL Agent is running under does have an impact.
sadie519590 wrote:
2.) rely on SQL Server storage for roles and access control - what does mean? does this relate to #1 above?
Without more context, I'm not sure. My best guess is that is referring to storing packages in SQL Server rather than the file system. If you do that, it can handle access to the packages.
sadie519590 wrote:
5.) New Job Step > General tab > SQL Server Integration Services Package > Server: Log on to the server [Windows Authentication] - will this be the same login as the job step owner, that is, the SQLServer account?
Yes, unless you have set up a proxy.
|||
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
It depends - if your connection managers are using Windows Authentication, then the account SQL Agent is running under does have an impact.
That's because the conn mgrs will be using the SQL Agent account, right? (Because this is the job step owner?)
Thanks|||Correct.|||o: is a mapped drive, correct? Mapped drives are a user-session concept. That is, when you log on, you can map drives. However, if I remote into that box and log on as well, I don't get to see your mapped drives. Service accounts, such as those that run SQL Server, SQL Server Agent, etc... don't have any clue what a mapped drive is. As soon as you log off of the box, the mapped drives go away. So yes, if you are going to use network shares, you HAVE to use UNC.|||
sadie519590 wrote:
So it seems the problem has to do with the fact that the SQLService domain account does not have permissions to access o:, or that I need to use a UNC path.
Yes, the unc path did solve the problem.
Thanks all.
Package won't run from job
I imported my packages to SQL Server 2005.
Under Stored Packages > msdb, if I right click the package name an select "run package", the package runs fine without errors.
However if I put the same package in a job and run the job, the job fails every time and won't tell me why. It just says to check the history. This is what the history says:
Executed as user: myDomain\SQLService. The package execution failed. The step failed.
Seems like some kind of permission failure.
Any ideas, please. This permissions stuff is driving me nuts.
Here are all the permission levels that I am aware of when running a job:
1.) permissions at the package level (conn mgrs, ftp mgrs, etc)
2.) rely on SQL Server storage for roles and access control - what does even mean?
3.) job owner (sa) - how does this differ from the job step owner?
4.) job step > run as [user] (SQL Server Agent Account) - why does this require a separate owner?
5.) package > server > Log on to the server [as user] - what the heck is this?
It's maddening. For example, I have no idea what #5 means - when it (the package) is logging on to the server, WHO is it logging on as if the package is running in a job? The job owner? The job step owner? Or something entirely different?
I've asked other people here if they have any clue how all these permissions interrelate, and frankly no one knows. It's a big mystery..... sort of like the big bang. Don't ask me how I passed my MCDBA. That is another great mystery of the universe.
It's no wonder I can't get this thing to run!
Thank you, error handler, for sending me task-related error messages.
So here's the problem:
The script threw an exception: Could not find a part of the path 'o:\myFolder'.
In one of my packages, I have a script that gets file names from the network drive, o:
My other packages reference the d: drive with no problem. It's the o: drive that's having problems.
So maybe whoever is running the package doesn't have permissions to the o: drive, right?
So, getting back to my first post, who needs the permission to read the o: drive? Probably the step owner?
|||
See this classic KB that explains it:
http://support.microsoft.com/default.aspx/kb/918760
The access to o: should be granted to the user who runs the package - Agent service account or Proxy account if you use it for this job. In your case this is currently myDomain\SQLService (from "Executed as user..." message). Also, if o: is network drive, switch to using UNC path (\\server\path) as drive mapping is user-specific.
|||So it seems the problem has to do with the fact that the SQLService domain account does not have permissions to access o:, or that I need to use a UNC path.
However, I still have some questions about how all the pkg/job permissions interrelate to each other:
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
2.) rely on SQL Server storage for roles and access control - what does mean? does this relate to #1 above?
3.) job owner (sa) - how does this differ from the job step owner?
5.) New Job Step > General tab > SQL Server Integration Services Package > Server: Log on to the server [Windows Authentication] - will this be the same login as the job step owner, that is, the SQLServer account?
If anyone could answer any or all of these questions, and #5 especially, I would really appreciate it!
Thanks much
|||
sadie519590 wrote:
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
It depends - if your connection managers are using Windows Authentication, then the account SQL Agent is running under does have an impact.
sadie519590 wrote:
2.) rely on SQL Server storage for roles and access control - what does that mean? Does this relate to #1 above?
Without more context, I'm not sure. My best guess is that it is referring to storing packages in SQL Server rather than the file system. If you do that, it can handle access to the packages.
sadie519590 wrote:
5.) New Job Step > General tab > SQL Server Integration Services Package > Server: Log on to the server [Windows Authentication] - will this be the same login as the job step owner, that is, the SQLService account?
Yes, unless you have set up a proxy.
|||
1.) permissions at the package level (ole db conn mgrs, ftp mgrs, etc) - the SQLService account does not have anything to do with this, right?
It depends - if your connection managers are using Windows Authentication, then the account SQL Agent is running under does have an impact.
That's because the conn mgrs will be using the SQL Agent account, right? (Because this is the job step owner?)
Thanks
|||Correct.
|||o: is a mapped drive, correct? Mapped drives are a user-session concept. That is, when you log on, you can map drives. However, if I remote into that box and log on as well, I don't get to see your mapped drives. Service accounts, such as those that run SQL Server, SQL Server Agent, etc., don't have any clue what a mapped drive is. As soon as you log off of the box, the mapped drives go away. So yes, if you are going to use network shares, you HAVE to use UNC.
|||
sadie519590 wrote:
So it seems the problem has to do with the fact that the SQLService domain account does not have permissions to access o:, or that I need to use a UNC path.
Yes, the unc path did solve the problem.
Thanks all.
Package won't run correctly when called
*****************************************************************
Function Main()
On Error Resume Next
Dim Excel_Application
Dim Excel_WorkBook
Dim Excel_WorkSheet
Dim sFilename
Dim sSheetName
sFilename = "\\Share\LCS Database Information - Data Test.xls"
sSheetName = "Experience Spreadsheet"
Set Excel_Application = CreateObject("Excel.Application")
Excel_Application.DisplayAlerts = False
' Open the workbook specified
Set Excel_WorkBook = Excel_Application.Workbooks.Open(sFilename)
Excel_WorkBook.Worksheets(sSheetName).Activate
FOR x = 2 TO 6 STEP 1
FOR i = 1 TO 5 STEP 1
Excel_WorkBook.WorkSheets(sSheetName).Cells(i,x).Value = 1
NEXT
NEXT
' ...repeated for the other sheets in the full script...
' Save and release Excel so the process does not stay resident
Excel_WorkBook.Save
Excel_WorkBook.Close
Excel_Application.Quit
Set Excel_WorkBook = Nothing
Set Excel_Application = Nothing
Main = DTSTaskExecResult_Success
End Function
*************************************************************************
It does this to 3 sheets and then the import steps run.
Now, when I execute the package in Enterprise Manager it works exactly as I want. However, when I run it through the stored procedure, the ActiveX doesn't run properly. My stored proc looks essentially like this:
****************************************************************************
--Create TEMP tables for the DTS package to store values from the spreadsheet
EXEC pCreateTEMPTables
IF @@ERROR <> 0
BEGIN
SET @Segment = 1
GOTO sqlerror
END
--Run the DTS package that will pull data into temp tables so the individual
--data points can be used to search up keys
EXEC msdb.dbo.sp_start_job
@job_name = 'LCSRunDTS'
EXEC more.procs
****************************************************************************
One thing about the spreadsheets is that they have links in them, so warning messages get thrown up during the ActiveX script about updating the links. I tried to turn them off, but one message, "One or more links can't be updated", which comes up if you click "Update" on the previous warning or suppress warnings, just won't go away.
Any ideas? Thanks
Well, I got the first problem above figured out. When I ran the package directly from Enterprise Manager, the ActiveX object was created on my local system and used Excel there to do what it needed to. When I tried to run the package on the server which didn't have Excel installed, my script failed. My new solution (as soon as it works) is to use WshRemote to run the script to alter the spreadsheets using a computer with Excel installed. The following script works when run from my PC but fails with the message "Error Source = Microsoft VBScript runtime error, Error Description: Permission Denied" when run as part of the DTS package called from a stored procedure via dtsrun:
--
Dim Controller, RemoteScript
Set Controller = CreateObject("WSHController")
Set RemoteScript = Controller.CreateScript("\\Server\Share\ExcelAdd1s.vbs", "MyLocalPC")
RemoteScript.Execute
Being new to the DBA world, I need some help with permissions. Is it that the SQL Server account doesn't have access to the shared drive, or to my PC? Where can I look to see what user is running the DTS package? Are there any questions that I'm not asking that I should be? Any help would be appreciated.
|||When you run the package from EM directly, your logged-in account needs to have permission to the share. When it runs from an agent job, the SQL Agent account must have permission to do the same thing, and it must be a UNC share and not a mapped drive.
package with a custom component (log provider) runs in BIDS, but doesn't run where deployed.
Hi,
I have a package with a custom log provider, which runs in BIDS. However, when I deploy the package onto SQL Server and run it on the deployed machine, it fails:
"failed to decrypt protected XML node DTS:Password...key not valid for use in specified state..."
Now this is definitely to do with the custom log provider, as if I take it out & redeploy, I can run it on the deployed server + also run it within a job. I have entered the custom log provider library (+ other required DLLs) in the GAC on the deployed machine, but I'm clearly missing something.
Any ideas pls? I'm really stuck.
Many thanks in advance,
Tamim.
Please search this forum...
You'll need to set the package protection level to "DontSaveSensitive" and then issue a password via the /SET command line switch. OR, you could try to promote to the server using SQL Server users/role level security, maybe...
|||
Thanks for the quick reply Phil, but that didn't work.
Is there anything extra I need to do/bear in mind with respect to the signed assemblies that comprise the custom log provider?
|||Tamim Sadikali wrote:
Thanks for the quick reply Phil, but that didn't work.
Is there anything extra I need to do/bear in mind with respect to the signed assemblies that comprise the custom log provider?
What do you mean, that didn't work? The error message that you are getting indicates to me that the protection level is set to EncryptSensitiveWithUserKey, which means that ONLY the user who built the package can execute it.
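For reference, the /SET approach Phil suggests can be sketched as a dtexec command line. This is a hedged illustration only: the package path, the connection manager name ("Source"), and the password are all made up, and in practice the password would come from a secured source.

```python
# Sketch of the dtexec invocation for a DontSaveSensitive package:
# the sensitive value is supplied at run time with /Set rather than
# being stored (encrypted) inside the .dtsx file.
package = r"\\deployserver\packages\MyPackage.dtsx"   # hypothetical path
password = "s3cret"                                   # placeholder only

args = [
    "dtexec",
    "/File", package,
    # /Set takes propertyPath;value - here the Password property of a
    # hypothetical connection manager named "Source"
    "/Set", r"\Package.Connections[Source].Properties[Password];" + password,
]

command_line = " ".join(args)
print(command_line)
```

The same argument list would be pasted into the SQL Agent job step's command line, so the package itself never has to carry an encrypted secret tied to one user's key.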
Package variables "disappear" from job properties
Creating a SQL Agent job to run a package works just fine. On the Step Properties, Set values tab, I can add variables and the values I want for them, no problem. The job runs just fine. If I want to go edit the values however, they're always gone.
Is there some security setting that hides the variables and their values which is on by default?
GregsListAcct wrote:
Creating a SQL Agent job to run a package works just fine. On the Step Properties, Set values tab, I can add variables and the values I want for them, no problem. The job runs just fine. If I want to go edit the values however, they're always gone.
Is there some security setting that hides the variables and their values which is on by default?
Greg,
I don't know the answer to this but regardless, you should stop using the SSIS sub-system and just call the package using the command-line sub-system instead.
There is an article at wiki.sqlis.com that explains the rationale behind this, but the site is down so I can't link to it. Basically you don't get any diagnostic info out of the SSIS sub-system.
-Jamie
Friday, March 23, 2012
Package scheduling and error trace-
I am scheduling the package to run on a nightly basis every day using a Windows scheduled task.
How do I check whether the package ran successfully or it has given any error?
Can anyone please suggest whether I can log the error in some table of SQL Server?
Please suggest the steps.
When building a package in BIDS, have a look at the logging options. See the SSIS menu item, Logging.
You can log to several locations including a table.
I would expect DTEXEC to return the appropriate exit code when it fails, so the scheduled task should fail. Unfortunately the Windows scheduler is not very advanced, so you cannot capture output or act on the return code. The lack of output and monitoring means I would not use it.
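To illustrate the log-to-a-table suggestion above: the SSIS SQL Server log provider writes events to a table (sysdtslog90 in SQL Server 2005). As a minimal stand-in, this Python sketch mimics the idea with sqlite and a deliberately simplified schema - the table and column names here are made up, not the real sysdtslog90 layout:

```python
# Minimal stand-in for logging package events to a database table,
# mimicking what the SSIS SQL Server log provider does. The schema is
# a simplified invention for the demo, not the real sysdtslog90 shape.
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE package_log (
           event     TEXT,
           source    TEXT,
           message   TEXT,
           logged_at TEXT
       )"""
)

def log_event(event: str, source: str, message: str) -> None:
    """Record one package event, timestamped, in the log table."""
    conn.execute(
        "INSERT INTO package_log VALUES (?, ?, ?, ?)",
        (event, source, message, datetime.now().isoformat()),
    )

log_event("OnError", "NightlyLoad", "Could not find path 'o:\\myFolder'")
rows = conn.execute("SELECT event, message FROM package_log").fetchall()
print(rows)
```

A nightly check is then a simple query for OnError rows since the last run, which is exactly the kind of monitoring the Windows scheduler cannot give you on its own.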
Package Runs in 1 minute in Visual Studio but takes 5+ (sometimes hangs) as Job
I have an SSIS package that when run from Visual Studio takes 1 minute or less to complete. When I schedule this package to run as a SQL Server job it takes 5+ and sometimes hangs complaining about buffers.
The server is a 4 way 3ghz Xeon (dual core) with 8GB ram and this was the only package running.
When I look in the log I see that the package is running and processing data, although very very very very very slowly.
Has anyone else experienced this?
Thanks for your help
Marcus
You need to identify what the differences are between the two executions. It could be network latency, utilisation of the box by other processes, the user that you're running it as....anything.
What happens when you execute the package on the Xeon using dtexec.exe rather than thru SQL Server Agent. Do you get the same?
-Jamie
|||
I don't know what is going on with my system. The above package has run fine since I posted this message. I'm now having issues with a stored procedure that is being run from a job.
I can run the SP from Management Studio and it completes. The CPU does spike above 90% but it completes.
When I execute the same SP from job I get the following error and ALL connections to the database are killed.
Msg 64, Sev 16, State 1: TCP Provider: The specified network name is no longer available. [SQLSTATE 08S01]
Msg 64, Sev 16, State 1: Communication link failure [SQLSTATE 08S01]
Msg 10004, Sev 16, State 1: Communication link failure [SQLSTATE 08S01]
I have taken the SQL out of the SP and tried to execute it directly from within the job and it still fails wth the same issue.
I have no idea what is going on or where to start looking.
Here is a copy of the SP that I'm trying to run from a job.... Help... I'm glad it's a Friday
CREATE PROCEDURE [dbo].[sproc_LoadData_FactMessageOpen] AS
SET NOCOUNT ON
TRUNCATE TABLE [dbo].[FactMessageOpen]
INSERT INTO [dbo].[FactMessageOpen] (
[DateID],[TimeID],[MailingID],[ProfileID],[MessageFormatID],[TransactionDateTime])
SELECT
(SELECT DateID FROM dbo.DimDate WHERE DateAK = CAST(CONVERT(varchar(10),X.TransactionDateTime,101) AS smalldatetime)) AS DateID,
(SELECT TimeID FROM DimTime WHERE Hour12 = LTRIM(SUBSTRING(CONVERT(varchar(50),X.TransactionDateTime,100),CHARINDEX(':',CONVERT(varchar(50),X.TransactionDateTime,100))-2,2)) AND Minute = SUBSTRING(CONVERT(varchar(50),X.TransactionDateTime,100),CHARINDEX(':',CONVERT(varchar(50),X.TransactionDateTime,100))+1,2) AND NoonFlag = SUBSTRING(CONVERT(varchar(50),X.TransactionDateTime,100),CHARINDEX(':',CONVERT(varchar(50),X.TransactionDateTime,100))+3,2)) AS TimeID,
(SELECT MailingID FROM dbo.DimMailing WHERE MailingAK = X.MailingAK) AS MailingID,
(SELECT ProfileID FROM dbo.DimProfile WHERE ProfileAK = X.ProfileAK) AS ProfileID,
(SELECT MessageFormatID FROM dbo.DimMessageFormat WHERE MessageFormatAK IS NULL) AS MessageFormatID,
X.TransactionDateTime
FROM
(
SELECT
ER.TEMPLATE_ID AS MailingAK,
ER.EMAIL_ADDR_ID AS ProfileAK,
MIN(ER.ACTION_DT) AS TransactionDateTime
FROM
TEST.MDB.EMAIL_RESPONSE ER
JOIN
TEST.MDB.ACTION_SUB_ACTION_LOOKUP AL
ON
ER.ACT_SUB_ID = AL.ACT_SUB_ID
WHERE
ACTION_CD = 'G'
GROUP BY
ER.TEMPLATE_ID,
ER.EMAIL_ADDR_ID
) AS X
ORDER BY
DateID,
TimeID
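As a side note on the TimeID lookup above: the CONVERT(varchar, dt, 100) style produces a string like 'Dec 16 2006  3:05PM', and the SUBSTRING/CHARINDEX gymnastics carve out the 12-hour hour, the minutes, and the AM/PM flag from it. A hedged Python sketch of the same three values (for illustration only, not a replacement for the SP):

```python
# What the TimeID subquery is matching against DimTime: the 12-hour
# hour with no leading zero (SQL's LTRIM), the minute, and the AM/PM
# flag. strftime gives the same three values without string surgery.
from datetime import datetime

def time_dimension_key_parts(dt: datetime) -> tuple[str, str, str]:
    hour12 = dt.strftime("%I").lstrip("0")    # '03' -> '3', like LTRIM
    minute = dt.strftime("%M")                # zero-padded minute
    noon_flag = dt.strftime("%p").upper()     # 'AM' or 'PM'
    return hour12, minute, noon_flag

print(time_dimension_key_parts(datetime(2006, 12, 16, 15, 5)))
```

Seeing the three components laid out this way also makes it easier to sanity-check the DimTime join columns (Hour12, Minute, NoonFlag) against sample data.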
Hi
I actually ran the query through SQL Profiler and it didn't come back with any recommendations. I also looked at the explain plan and every node is using an index.
Does the query look poorly written?
thanks
Package runs fine in Debug.. fails outside of debug
I created a package that runs fine while in debug... but when I run outside of debug (just choosing Debug.. Start without debugging) I get an error of:
Message: The task "Formulate SQL Query and Destination" cannot run on this edition of Integration Services. It requires a higher level edition.
I am on SP1 on my workstation and the only SSIS server I have. I am not doing anything fancy in my script.. I have created other packages and ran outside of debug mode without issue. I did uninstall SQL2005 and SSIS from my workstation as it was eating up too many resources.
The script I am running is (removed some of the SQL etc to shorten it):
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.IO
Public Class ScriptMain
Public Sub Main()
' Create the SQL
Dim SQL As String
SQL += "SELECT "
SQL += " -- Lots of stuff"
SQL += "FROM "
SQL += CStr(Dts.Variables("gvSourceTableName").Value) & " "
SQL += "WHERE "
SQL += " stuff = '" & CStr(Dts.Variables("gvCurrentSymbol").Value) & "' "
SQL += "ORDER BY date"
Dts.Variables("lvSQLQuery").Value = SQL
' MsgBox(CStr(Dts.Variables("lvSQLQuery").Value))
' Create the filename
Dim Dir As String
Dim FileName As String
Dim FullFileName As String
Dir = CStr(Dts.Variables("gvDestinationRoot").Value) & "\" & CStr(Dts.Variables("gvRunID").Value) & "\"
FileName = CStr(Dts.Variables("gvCurrentSymbol").Value) & "--.txt"
FullFileName = Dir & FileName
If Not Directory.Exists(Dir) Then
Directory.CreateDirectory(Dir)
End If
Dts.Variables("gvDestinationFile").Value = FullFileName
' MsgBox(CStr(Dts.Variables("gvDestinationFile").Value))
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
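The second half of the script above builds a destination folder per run and creates it if missing. The same pattern in a hedged Python sketch, where the arguments stand in for the package variables (gvDestinationRoot, gvRunID, gvCurrentSymbol - names taken from the script, values invented):

```python
# Sketch of the destination-file logic above: build
# <root>\<run id>\<symbol>--.txt and create the folder if it is missing.
import os
import tempfile

def build_destination(root: str, run_id: str, symbol: str) -> str:
    """Return the full output path, creating the run's folder on demand."""
    directory = os.path.join(root, run_id)
    os.makedirs(directory, exist_ok=True)  # no-op if it already exists
    return os.path.join(directory, symbol + "--.txt")

root = tempfile.mkdtemp()  # stand-in for gvDestinationRoot
path = build_destination(root, "run42", "MSFT")
print(path)
```

The exist_ok check mirrors the script's `If Not Directory.Exists(Dir)` guard, so repeated runs into the same run-ID folder don't fail.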
FYI.. tried reinstalling 2005 SP1 on my workstation... still receiving the error message.
|||This looks familiar. Try going to the SQL Server 2005 installer and install Integration Services on your local machine. The messages I've received that were similar to this were solved by that. I believe you have the designer installed on your machine, but not the Integration Services engine. Good luck.
|||Thanks, when I uninstalled SQL 2005 I must have taken the SSIS stuff with it.. reinstalled SSIS server components and patch and it works great now.
|||Chris Honcoop wrote:
Thanks, when I uninstalled SQL 2005 I must have took the SSIS stuff with it.. reinstalled SSIS server components and patch and works great now.
You're welcome. You know, you'd think that a better error message could be raised... maybe something that actually says you don't have SSIS installed... oh well.
Package running other packages
Hi everybody,
I have to create a package that executes other packages I've already created... Every one of these packages runs with the same configuration file, but if I try to execute the main one, including the path of the file, I get errors from the other packages because they can't find it... How can I manage to pass this file and its content to the other packages?
Here a little explanation of te process:
Main Package needs configuration file X.dtsConfig, and calls: package1, which needs configuration file X.dtsConfig; package2, .......
I hope everything is clear...
I have completely replaced all these packages using config files and calling other packages.
It is supposed to work with SP1 though. I did not try SP1 yet.
In the meantime, I have consolidated all these "sub-packages" functionality into the main packages.
Philippe
|||teone wrote:
Hi everybody,
I have to create a package that executes other packages I've already created... Every one of these packages run with the same Configuration File, but if I try to execute the main one, including the path of the file, I get errors from the other packages because they can't find it... How can I manage to pass this file and its content to the other packages?
Here a little explanation of te process:
Main Package needs configuration file X.dtsConfig, and calls: package1, which needs configuration file X.dtsConfig; package2, .......
I hope everything is clear...
You can pass values from the parent package through to child packages using parent package configurations.
-Jamie
|||but what I need is to pass the whole configuration file, not only some variables of the package... how can I do that?
|||Why not just reference the same config file in each package?
This is made a lot easier by the use of indirect configurations: http://blogs.conchango.com/jamiethomson/archive/2005/11/02/2342.aspx
-Jamie
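The indirect-configuration pattern Jamie links to keeps the .dtsConfig path out of the packages themselves: each package reads an environment variable that points at the file, so parent and children all resolve the same configuration. A hedged sketch of the lookup (the variable name and path are made up for the demo):

```python
# Sketch of an indirect configuration lookup: the package holds only
# the environment variable name; the machine holds the actual path.
# Variable name and path below are invented for illustration.
import os

os.environ["SSIS_CONFIG_PATH"] = r"\\server\configs\X.dtsConfig"  # set once per machine

def resolve_config(var: str = "SSIS_CONFIG_PATH") -> str:
    """Return the config file path stored in an environment variable."""
    path = os.environ.get(var)
    if path is None:
        raise RuntimeError(
            f"environment variable {var} is not set; note that newly "
            "created system variables are only seen after a restart/re-logon"
        )
    return path

print(resolve_config())
```

This is also why the "restart the machine" advice below matters: processes started before the variable existed never see it.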
|||I am stupid, and I need further explanations....
I created a new environment variable, and I set its value with the path of my configuration file... is this the right thing to do?
After this I have enabled package configuration on the package, and set the source to that variable, but it doesn't work... Why?
Thanks
|||Restart the machine so the sys variable is recognized.
|||teone wrote:
I am stupid, and I need further explanations....
I created a new environment variable, and I set its value with the path of my configuration file... is this the right thing to do?
Yes.
teone wrote:
After this I have enabled package configuration on the package, and set the source to that variable, but it doesn't work... Why?
Thanks
Without being there its a bit hard to say. What "doesn't work"? What behaviour are you expecting? What happens instead?
-Jamie
|||The people I work for decided I don't need to do this package anymore... so I quit thinking about it. Thanks anyway!
|||
Bonus!!
I wish my superiors would say that to me sometime! :)
Package randomly stops
I have a very weird issue in my latest package. I run it and it just randomly stops. When I watch it, whether in debug mode or not, a command prompt window will flash for an instant (too fast to read) and then the package will stop. It stops inside of a for each loop, and the "on completion" is never fired for the loop. I never receive any errors - it's just like someone hit the stop button. Sometimes it will process hundreds of files before stopping, other times only a few. (And it's not stopping on the same file each time.. it doesn't appear to be related to the file at all.)
Any ideas what could be going on? How to troubleshoot?
The process might have crashed (the console window is SqlDumper collecting the crash information). Check if C:\Program Files\Microsoft SQL Server\90\Shared\ErrorDumps\ contains any dumps. Also see this thread:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=541734&SiteID=1
Make sure you install the SP1, and if the problem still occurs, contact product support services.|||
Yes that folder contains files from the times it stopped - what can I read them with?
That thread mentions turning on error reporting - do you do that on the server or my dev workstation? (and where do I find that setting?)
Another problem I have (maybe related?) is that about once a day Visual Studio will just "disappear" while I am working on a package.. (or sometimes I am not even working on it) however there are no files for those occurrences.
Yeah, I've been using SP1 for quite some time.
Thanks
Chris
Chris Honcoop wrote:
Yes that folder contains files from the times it stopped - what can I read them with?
That thread mentions turning on error reporting - do you do that on the server or my dev workstation? (and where do I find that setting?)
The files are minidumps that can be read by debuggers, like Visual Studio. I would not recommend you try to read them - they are meant for product developers, not for users (unless you've built a custom task or transform and suspect it might be involved in the crash).
Use the SQL Server Error and Usage Reporting tool to enable the "Send error reports..." option - this will allow the development team and product support to analyze the problem and likely provide a fix in the next SP. If you need a solution sooner, contact Product Support Services.
|||The only custom objects I have installed is the RegEx one available from Microsoft download - however I am not using it in this package (as it wouldn't work unless I was running the package from my workstation). I did uninstall it - however it's still in my list of available objects if I go into "choose items..."
|||Well I ran the package directly on the server and it crashed as well, so it doesn't appear to be something specific to my machine.
Overview of the dataflow component that fails: I have a flat file that has data in the header I need to keep and tack onto each row (see my thread on cross join), as well as one set of metadata that determines which rows in the file I need to keep. So I take the file and split it four ways - one of the splits is basically all data except the metadata. I join one set of metadata to the main data to retrieve the rows of data I need. Then I also add a column to each datastream, hardcode it to 1, and join all the metadata back to the main datastream (since there is no cross join function).
Funny thing is I have been watching what it looks like when it crashes and the file and the split always work but the sorting before joining back into the merges or the merges themselves is where it seems to die (if indeed the graphical view at the time of crash is correct).
How do I know if it is sending the crash info to msft? Its crashed 5 times since turning that on and only once did it bring up a box saying visual studio encountered an error.
|||Tried some more things as well... still random crashes: I've remade the package from scratch, restructured the dataflows 4 or 5 different ways.. still the random "disappearing act" and memory dump to that location.
|||Please contact product support services - they'll be able to analyze the crash dumps and provide a solution for the problem, or a temporary workaround until the full solution is developed. It is hardly possible to analyze crashes via a forum.
Wednesday, March 21, 2012
Package from Package Store does not work as Job
Dear All,
I try to run a Package which is placed in the Package Store via a job - but this does not work.
Some things about the package:
- The package evokes some warnings, because I do not use all columns from the datasource
- Package runs in BI Studio - log is written
- Package runs if started from the Package Store - log is written
- Package execution stops immediately if started via job - without writing to the log
As a hint - needless to say :-) - the log provider is configured in all three places...
Does anybody know if I could use the configured log provider from the package directly in the jobs _without_ redoing the settings?
Other jobs - without the warnings - are running and placing their step results in the log.
Thanks in advance!
Cheers
Markus
Some info that may help-
Scheduled Packages
(http://wiki.sqlis.com/default.aspx/SQLISWiki/ScheduledPackages.html)
Darren,
sorry for this late reply - you are right!
The user did not have appropriate rights on the package store folder.
cheers,
markus