
Wednesday, March 28, 2012

PackageStart/End Events

Just finishing off a large ETL system, and I have a question about the following:

We have 164 child packages being called from a single parent package, which is set up to perform logging to a SQL Server table. Anything that errors or has warnings is logged accordingly. However, how do you trap the PackageStart and PackageEnd events when the child package knows nothing about logging?

My first thought was to have the Parent call the AddEvent SP with the appropriate values, but I thought I would check here in case I missed something.

In a parent/child package structure I place all my logging in one place - in the parent package. All events (including child package OnPreExecute/OnPostExecute events) "bubble-up" to the parent package where they are "trapped" by the log provider.

-Jamie

|||That's exactly what I'm doing; I get everything but the start/end events.|||

Do you mean OnPreExecute & OnPostExecute?

I can't imagine why you're not getting them. It works for me!

-Jamie

|||The parent package adds entries to SYSDTSLOG90 for PackageStart and PackageEnd, plus any other log entries (Warnings, Errors etc.) from either the Parent or the Child, but excluding the PackageStart and PackageEnd entries for the Child package. It's those events I would like to get logged; now that we have no warnings or errors, the log looks a bit thin with just the Parent PackageStart and then, 4-5 hours later, the Parent PackageEnd.|||

Then I'm stumped!

The parent logs its 'PackageStart', then executes the 'Execute Package' Task a number of times. All the packages execute normally and bubble up any Errors or Warnings to the parent. The only thing that appears not to bubble up (and consequently is not logged to sysdtslog90) is the Child PackageStart and PackageEnd events.

In the parent I have configured the 'Execute Package' Task as follows: DisableEventHandlers=False, with event handlers for OnError, OnPostExecute, OnPreExecute, OnTaskFailed and OnWarning. Each of these has no tasks but merely serves as a placeholder in order to set Propagate=False.

It's not causing any problem to our application other than the fact that I am trying to write a UI that links our application-maintained log with sysdtslog90.

Any pointers gratefully accepted

Paul
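
For reference, a minimal T-SQL sketch of the kind of query being described: it pairs whatever PackageStart and PackageEnd rows the SQL Server log provider does write to sysdtslog90, grouped by executionid. Linking this to an application-maintained log would simply be an additional join on executionid; that table is not shown here.

-- Minimal sketch, assuming the standard sysdtslog90 table created by the
-- SSIS SQL Server log provider in SQL Server 2005.
SELECT
    l.executionid,
    MIN(CASE WHEN l.event = 'PackageStart' THEN l.starttime END) AS package_start,
    MAX(CASE WHEN l.event = 'PackageEnd' THEN l.endtime END) AS package_end,
    DATEDIFF(minute,
             MIN(CASE WHEN l.event = 'PackageStart' THEN l.starttime END),
             MAX(CASE WHEN l.event = 'PackageEnd' THEN l.endtime END)) AS duration_minutes
FROM dbo.sysdtslog90 AS l
WHERE l.event IN ('PackageStart', 'PackageEnd')
GROUP BY l.executionid;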

Wednesday, March 21, 2012

Package fails but single Task ends with success

Hi

I've created a simple package that contains only one task, an Execute SQL Task. When I run only this single task from Business Intelligence Development Studio it runs successfully, but when I run the whole package (also from Business Intelligence Development Studio), the package fails.

The data source I access is ODBC. I'm sure the real reason for the error is the data source's bad ODBC driver, but this can't be changed. So I need to know what is different between running only a single task in a package and running the whole package. If I knew that, I might be able to adjust some setting and make it work.

Any help welcome.

What is the error you are receiving?

Rafael Salas

|||I had a similar problem the other day.

All the individual steps would run OK in Visual Studio, but the package as a whole would fail.

The problem was a spurious executable that was not visible on the control flow design surface, but was listed as an executable in Package Explorer.

It was this spurious executable that was failing, and so causing my package to fail.

Have a look at what is shown as an executable in the Package Explorer tab.|||

Hi Rafael

Thanks for your answer.

I get a message box error with "Unable to load DLL, Fatal Error!" in the title and the DLL's path in the message body. The DLL indicated there belongs to the system that should be accessed.

Christian

|||

I've checked the package in Package Explorer, but there's only the one executable that I've just created.

Christian


Package fails "Cannot use a CONTAINS or FREETEXT predicate"

I have a database (SQL Server 2000 Enterprise Ed.) with a single table (Products), which has 3 columns set up for full-text indexing. This database is working fine. All queries are done through stored procs and work like a charm.
I tried to copy the database to another database (SQL Server 2000 Personal Ed.) which is on my PC. I created a DTS package on the originating server. This DTS package used to work fine for a long time, until a few weeks ago when I introduced FTI, and now it does not work.
I get this error when executing the DTS package:
"[ODBC SQL Server Driver][SQL Server]Cannot use a CONTAINS or FREETEXT predicate on table 'Products' because it is not full-text indexed."
This table is full-text indexed on the originating and also on the destination database. (I created an index on the destination, thinking perhaps that was why it wouldn't work.)
Does anyone know how to fix this? Thanks.
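One thing worth double-checking is whether full-text indexing is genuinely enabled and populated on the destination database itself. A rough T-SQL sketch of the SQL Server 2000 steps; the catalog name, key index name and column name below are placeholders rather than the actual objects:

-- Sketch only: enable and populate full-text indexing on the destination.
-- 'ProductsCatalog', 'PK_Products' and 'ProductName' are placeholder names.
EXEC sp_fulltext_database 'enable'
EXEC sp_fulltext_catalog 'ProductsCatalog', 'create'
EXEC sp_fulltext_table 'Products', 'create', 'ProductsCatalog', 'PK_Products'
EXEC sp_fulltext_column 'Products', 'ProductName', 'add'
EXEC sp_fulltext_table 'Products', 'start_full'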
Is your DTS package dropping the table each time? It should merely delete
the data.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
|||The DTS package is NOT dropping the table, and I do remember seeing data in the Products table a while back. But I will check this again to make sure and get back.
BTW, I couldn't get to nwsu.com's SQL Server replication book; perhaps it is blocked by the corporate proxy.
|||It is also available on Amazon.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
|||Alright, I checked. The DTS package is NOT dropping the table in either the destination or the originating database. It is also properly copying the contents of the Products table into the destination database. Yet the package will still fail with that message.
|||How are you transferring the data?
Are you using the Transform Data task or the Transfer Objects task?
If the latter then what options do you have set?
Allan Mitchell MCSE,MCDBA, (Microsoft SQL Server MVP)
www.SQLDTS.com - The site for all your DTS needs.
www.SQLIS.com - SQL Server 2005 Integration Services.
www.Konesans.com
|||Allan,
I am using "Copy SQL Server Objects Task" to copy the database, with
all the defaults settings.
|||Personally, if I were copying a database I would use BACKUP/RESTORE. I am not a great fan of the Copy Objects task.
Allan
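A minimal sketch of that BACKUP/RESTORE approach in T-SQL; the database name, file paths and logical file names below are placeholders:

-- Sketch only: back up the source database and restore it over the copy.
BACKUP DATABASE ProductsDb
    TO DISK = 'C:\Backups\ProductsDb.bak'
    WITH INIT

RESTORE DATABASE ProductsDb
    FROM DISK = 'C:\Backups\ProductsDb.bak'
    WITH MOVE 'ProductsDb_Data' TO 'C:\Data\ProductsDb.mdf',
         MOVE 'ProductsDb_Log' TO 'C:\Data\ProductsDb_log.ldf',
         REPLACE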
"Bik" <bikmann@.gmail.com> wrote in message news:bikmann@.gmail.com:
> Allan,
> I am using "Copy SQL Server Objects Task" to copy the database, with
> all the defaults settings.
|||DTS allows an unattended copy to another running SQL Server. I have used BACKUP/RESTORE, but I wanted to use DTS so that I could put it on a schedule. Currently I have 3 other DBs being copied via DTS; this is the only one that causes a problem, yet it still copies the table contents.
|||You can also set up a job to do the BACKUP and RESTORE unattended.
Allan
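And a rough sketch of scheduling such a backup as a SQL Server Agent job using the msdb procedures; the job name, command and schedule here are illustrative only:

-- Sketch only: a daily Agent job that runs the backup at 02:00.
EXEC msdb.dbo.sp_add_job @job_name = N'Nightly ProductsDb backup'
EXEC msdb.dbo.sp_add_jobstep @job_name = N'Nightly ProductsDb backup',
     @step_name = N'Backup',
     @subsystem = N'TSQL',
     @command = N'BACKUP DATABASE ProductsDb TO DISK = ''C:\Backups\ProductsDb.bak'' WITH INIT'
EXEC msdb.dbo.sp_add_jobschedule @job_name = N'Nightly ProductsDb backup',
     @name = N'Daily 2am',
     @freq_type = 4,          -- daily
     @freq_interval = 1,
     @active_start_time = 20000   -- 02:00:00
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Nightly ProductsDb backup', @server_name = N'(local)'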
"Bik" <bikmann@.gmail.com> wrote in message news:bikmann@.gmail.com:
> DTS allows unattended copy to another running SQL Server. I have used
> BACKUP/RESORE, but i wanted to use DTS so that I can put it on
> schedule. Currently I have 3 other DBs being copied via DTS, this is
> the only one that causes problem., yet it would still copy the table
> contents.

Monday, February 20, 2012

Overlapping of PDF

I have a few reports that I want to get into a single PDF. I am combining all of them as subreports in a single parent report, then converting that report to PDF using the SOAP API.
I would like to mention some points:
- I have properly placed rectangles (with 0 inch height and the Page Break At End property set to true) as page breaks.
- Also, the combination of reports is such that some fit in portrait while others fit in landscape.
The problems I am facing:
- On publishing them to the server, they look just fine. When I try to render to PDF, I don't know why, but they just overlap somewhere.
- Another problem is the quality of the PDF. It seems to get distorted (though the printout looks decent). This will definitely be a problem if the user prefers an online copy of the report.
Can you help me out?
Would it be possible for you to send me the RDLs, the rdl.data file, and the PDF directly?
Without more information I don't think I can recommend anything to you.
--
Bruce Johnson [MSFT]
Microsoft SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
"Kam" <Kam@.discussions.microsoft.com> wrote in message
news:CBDF9219-D306-412E-9DB4-36783ADCBD9A@.microsoft.com...
> I have few reports that I want to get into a single PDF. I am clubbing all
of
> them as subreports in a single parent report. Then I convert that report
to
> PDF using SOAP API.
> I would like to mention some points :
> - I have properly placed rectangles (with 0 inch height and Page Break At
> End property set to true) as pagebreaks.
> - Also the combination of reports is such that some will fit in a
Protrait
> while others in Landscape.
> The problems I am facing :
> - On publishing them to the server, they just look fine. When I try to
> render that to PDF, I dont know why they just overlap somewhere.
> - Another problem is that the quality of the PDF. It seems to get
> distorted(though the printout seems decent). This will definitely be a
> problem if the user prefers an online copy of the report.
> Can you help me out?

Over-allocating space for a database, impact on performance

SQL Server 2005:
Assume I have 4 GB of data in a single database. I expect to have 15 GB
after two years. I have a 300 GB logical drive. Is there any harm in
setting the default size of the database to 20 GB? This avoids
fragmentation as new space is allocated to the drive, but creates a "big
shell". Is there a performance hit by sizing it big initially? Data will
be on RAID 5, Logs on RAID 1, and OS on RAID 1.
Thanks,
Mark
Other than the cost of creating the initial file, or restoring a backup,
(both of which are negligible if you have instant file initialization on), I
can't think of anything. (Backups will only see pages with actual data on
them, so it won't affect those.) It's good to size the file appropriately,
if you have the space... ideally, you will NEVER have an unplanned autogrow
event. Things can always change, e.g. after six months you might revise
your two-year estimate and want to grow the file again at some point. But
it is much better to plan this growth for a planned maintenance window or
low volume period, as opposed to letting the file grow in the middle of a
busy day... users will be unhappy.
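If you do pre-size the data file, a minimal T-SQL sketch; the database and logical file names below are placeholders:

-- Sketch only: grow the data file to 20 GB up front, with a fixed growth
-- increment rather than percentage autogrow.
ALTER DATABASE MyDatabase
    MODIFY FILE (NAME = 'MyDatabase_Data', SIZE = 20480MB, FILEGROWTH = 1024MB);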
"Mark" <mark@.idonotlikespam.com> wrote in message
news:OJAsB%239mIHA.1368@.TK2MSFTNGP02.phx.gbl...
> SQL Server 2005:
> Assume I have 4 GB of data in a single database. I expect to have 15 GB
> after two years. I have a 300 GB logical drive. Is there any harm in
> setting the default size of the database to 20 GB? This avoids
> fragmentation as new space is allocated to the drive, but creates a "big
> shell". Is there a performance hit by sizing it big initially? Data
> will be on RAID 5, Logs on RAID 1, and OS on RAID 1.
> Thanks,
> Mark
>
|||It sounds like you are taking a set-it-up-and-forget-it approach to space management. I'd argue that you should always monitor your database space usage. Leave enough free space so that you don't run into a space shortage unexpectedly, but don't let over-allocation give you a false sense of security.
One potential downside of over-allocation is that if you need to detach, copy, and attach the database, you'd need to move a lot more data than you otherwise would.
Linchi
"Mark" wrote:

> SQL Server 2005:
> Assume I have 4 GB of data in a single database. I expect to have 15 GB
> after two years. I have a 300 GB logical drive. Is there any harm in
> setting the default size of the database to 20 GB? This avoids
> fragmentation as new space is allocated to the drive, but creates a "big
> shell". Is there a performance hit by sizing it big initially? Data will
> be on RAID 5, Logs on RAID 1, and OS on RAID 1.
> Thanks,
> Mark
>
>
