Hi everyone,
I'm quite a newbie to SQL and DTS and I need your help!
The situation:
I've got an Access 97 front end with a SQL 7 back end. On the SQL Server I created a job (to execute a DTS package) which is triggered by a custom error. From the Access side I raise this error to start executing the DTS package. I do it this way: execute master..xp_logevent 50001, LOG
This works fine; the job runs and everything. However, I need to know when the job has ended. The code that comes after this call in Access 97 needs to be sure that the DTS package has finished executing.
How do I do that? Please bear in mind that I'm quite a newbie, so simple language please :)
TIA, Raymond

Maybe the last step in the DTS package could populate a field in some table, and your Access 97 front end could keep checking that table to see whether that field has been updated.
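One way to sketch that status-table suggestion; every name here (the JobStatus table, its columns, the job name) is hypothetical, not something from your setup:

```sql
-- Hypothetical status table; all names are illustrative.
CREATE TABLE JobStatus (
    JobName    varchar(50) NOT NULL PRIMARY KEY,
    Finished   bit NOT NULL DEFAULT 0,
    FinishedAt datetime NULL
)

-- Access clears the flag just before raising the custom error:
UPDATE JobStatus SET Finished = 0, FinishedAt = NULL WHERE JobName = 'MyDTSJob'

-- The final step of the DTS package (an Execute SQL task) sets it:
UPDATE JobStatus SET Finished = 1, FinishedAt = GETDATE() WHERE JobName = 'MyDTSJob'
```

Your Access code would then loop (with a timer or DoEvents so the UI stays responsive) on `SELECT Finished FROM JobStatus WHERE JobName = 'MyDTSJob'` until it reads 1, with some timeout so it cannot wait forever if the job fails.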
Showing posts with label dts.
Monday, March 19, 2012
Check for a file before executing DTS
Thanks in advance for any help offered!
I am having a text file sent from another location on a daily basis, supposedly at 2:00 AM. I created a DTS package to bring the data into a local table (5:00 AM). The data I am bringing in replaces the old data, so what happens in the DTS package is that all data is dropped from the table and then the new data is inserted.
I then have another job that runs later in the day, before the next incoming data arrives (7:00 PM). This job deletes the old text file so that it is not appended to the next time the text file is sent to me.
The problem is that for the last two days, the server sending the text file did not send the files before my local jobs ran (the files arrived as late as 9:00 AM and 7:00 AM). Thus my 7:00 PM job deleted the old text file. Then my job that calls the DTS package ran (5:00 AM); it dropped all data and then tried to load new data that was not there, because my 7:00 PM job had deleted the text file and their 2:00 AM job had not delivered the new one.
My question is: what is the best way to script a job that checks whether the text file exists before dropping the existing table?
I know that the whole process could be handled better if both the supplier of the data and I, the end user, could be more flexible. The problem is that the sender is going to send the data in the manner (time and method) that requires the least amount of work for them. I just need to deal with it.
Thanks,
Lee

What type of text file? How are you processing it now in your DTS package? What steps are currently being used?

Hi,
I had the same problem with a daily data import DTS package I created. Here's the way I solved it:
Open your DTS package in design view and add an ActiveX Script task, written in VBScript:
'*********************************************************************
'  Visual Basic ActiveX Script
'*********************************************************************
Function Main()
    Dim MyFSO
    Set MyFSO = CreateObject("Scripting.FileSystemObject")
    ' local_server_path holds the file's full path on the server
    If MyFSO.FileExists(local_server_path) Then
        Main = DTSTaskExecResult_Success
    Else
        Main = DTSTaskExecResult_Failure
    End If
End Function

Here, local_server_path is your file path on the server. Use the task's 'On Success' workflow constraint to start the rest of your DTS package.
Hope it helps.
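If you would rather keep the check in the SQL Agent job itself, a T-SQL step before the DTS step can do the same test. This is only a sketch: xp_fileexist is an undocumented extended procedure, and the path below is illustrative.

```sql
-- Check for the source file before the DTS step runs.
-- xp_fileexist is undocumented; the path is illustrative.
DECLARE @exists int
EXEC master..xp_fileexist 'C:\imports\daily.txt', @exists OUTPUT
IF @exists = 1
    PRINT 'File found - the next job step can run the DTS package'
ELSE
    -- Fail this step so the job stops before the table is dropped.
    RAISERROR ('Source file missing; aborting the import.', 16, 1)
```

If this step fails, configure the job so the DTS step only runs "on success" of this one, mirroring the 'On Success' workflow idea inside the package.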
Check File size before importing
I have a DTS procedure that runs 4 times a day. It has worked well until now. It reads information from a text file. Today the text file was empty and DTS failed, giving me this error: "The volume for a file has been externally altered so that the opened file is no longer valid." I believe this is happening because the file size is 0 KB (this file is constantly overwritten). Is there a way, before importing a text file, to check and verify that the file is not empty?
Use FSO (FileSystemObject) to check the size first in an ActiveX script; you can find an example on sqldts.com.
Even confirming that the file size is > 0 may not reliably indicate that the source data transfer has completed. I have a similar situation where files are being FTP'd from the mainframe and can take half an hour or more to complete (assuming they do complete). My solution was to ask the data processing staff to download the file with an extension of .TMP and then rename it to .DAT after the download is complete.
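For the size check itself, here is one possible sketch as a T-SQL job step rather than an ActiveX script. xp_getfiledetails is an undocumented extended procedure whose result-set layout below is from memory and should be verified on your server first; the path is illustrative.

```sql
-- Sketch only: xp_getfiledetails is undocumented; verify its columns
-- on your server before relying on this.
CREATE TABLE #fd (
    AltName varchar(32) NULL, Size int NULL,
    CreationDate int NULL, CreationTime int NULL,
    LastWrittenDate int NULL, LastWrittenTime int NULL,
    LastAccessedDate int NULL, LastAccessedTime int NULL,
    Attributes int NULL
)
INSERT INTO #fd EXEC master..xp_getfiledetails 'C:\imports\daily.txt'
IF EXISTS (SELECT * FROM #fd WHERE Size > 0)
    PRINT 'File has data - safe to run the import'
ELSE
    RAISERROR ('File missing or empty; aborting the import.', 16, 1)
DROP TABLE #fd
```

As the reply above notes, a non-zero size still does not prove the transfer has finished; the .TMP-then-rename convention is the more robust signal.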
Thursday, March 8, 2012
Check Constraint and DTS Problem
Hi,
When using DTS, is it possible to trap errors caused by check constraints?
For example, I have a check constraint on the first column and another on the second column. If my data does not comply with the rules set on the first column, is there a way for me to check whether my data satisfies the second column's condition?
The reason is that I would like to validate all my columns and log all the errors of each field (if any). But using check constraints, DTS only tests up to the first column that has an error, and then throws an exception without checking the rest.
If it is not possible, some expert advice on this issue would be greatly appreciated. :)
Also, is there a way for me to know which column and row DTS is processing?
Thanks!
regards,
Tuan

Try to anticipate rather than react: check first whether the rows you are going to insert conflict with any constraint, and log each violation accurately.
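A sketch of that idea in T-SQL: load the file into a staging table first, then test each rule separately and log every violation before inserting only the clean rows. All table, column, and rule names here are made up for illustration; substitute the actual conditions from your check constraints.

```sql
-- Validate every column rule independently so all errors are logged,
-- instead of stopping at the first failing constraint.
INSERT INTO ImportErrors (RowId, ColumnName, Problem)
SELECT RowId, 'Col1', 'Col1 out of range'
FROM StagingTable WHERE NOT (Col1 BETWEEN 1 AND 100)

INSERT INTO ImportErrors (RowId, ColumnName, Problem)
SELECT RowId, 'Col2', 'Col2 not a valid code'
FROM StagingTable WHERE Col2 NOT IN ('A', 'B', 'C')

-- Only rows that passed every check go into the real table:
INSERT INTO TargetTable (Col1, Col2)
SELECT s.Col1, s.Col2
FROM StagingTable s
WHERE NOT EXISTS (SELECT * FROM ImportErrors e WHERE e.RowId = s.RowId)
```

Because the rules are duplicated from the constraints by hand, they must be kept in sync if the constraints change; the benefit is a complete error log per row and column.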
Sunday, February 12, 2012
CHAOS
How come there is a Chaos isolation level in the Advanced tab of the DTS package properties? Was someone having a bad day and thought their life was in chaos?

Hi, Andre
Books Online says (in the "DTS Package Properties (Advanced Tab)"
topic):
Transaction isolation level
Select from one of these levels:
Chaos. You can see uncommitted changes made by other transactions,
but update locks are not held to the end of the transaction.
Rollback is not supported. This isolation level is not supported
by SQL Server.
Read Committed. You cannot see changes made by other transactions
until those transactions are committed.
Read Uncommitted. You can see uncommitted changes made by other
transactions.
Repeatable Read. You are guaranteed not to see any changes made by
other transactions in values it has already read.
Serializable. This option guarantees that all concurrent
transactions will interact only in ways that produce the same effect
as if each transaction were executed entirely one after the other.
Razvan

It is an unsupported isolation level. In general, with this isolation level, a transaction can view the uncommitted changes made by other transactions. There is no real control over the update locks: they may or may not be held till the end of the transaction. Failures can be fatal, will not preserve ACID properties, and there are no rollbacks.
Anith

It is used by the SQL engine itself. Why it made it onto some GUI tab is anyone's guess.
The only way I've ever seen it changed was by fiddling with the PSS struct while under a debugger.
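For reference, the four supported levels can be set explicitly per connection in T-SQL; there is no SET option for Chaos, which matches the Books Online note that SQL Server does not support it. The table name below is illustrative.

```sql
-- Read Uncommitted corresponds to the "dirty read" behavior described above.
-- There is no SET option for Chaos.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
SELECT * FROM SomeTable   -- may see uncommitted changes from other transactions
```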