Thursday, March 22, 2012
Check if files exist Stored Proc
This stored proc uses the DOS Copy command and the xp_cmdshell stored
procedure to copy files from one location to another. See the code
below; it works fine. What I need is to add a check for whether the
files in the source location exist. If not, send an email (using
xp_sendmail) to us saying "files do not exist". If they do, start
copying the files. How do I do that?
Thanks for your help.
declare @source varchar(150)
declare @destination varchar(150)
declare @DOScmd varchar(300)
select @source =
case @@servername
when 'A' then '\\server1\folder1\a*.*'
when 'B' then '\\server2\folder2\b*.*'
when 'C' then '\\server3\folder3\c*.*'
end,
@destination =
case @@servername
when 'A' then '\\serverx\folderx'
when 'B' then '\\serverx\folderx'
when 'C' then '\\serverx\folderx'
end
-- copy only if the files in the source folder exist; otherwise send an
-- email saying "files do not exist".
set @DOScmd = 'Copy ' + '"'+ rtrim(@source) +'"'+ ' ' +'"'+
rtrim(@destination)+'"'
exec master.dbo.xp_cmdshell @DOScmd
GO
*** Sent via Developersdex http://www.examnotes.net ***
|||
Look up xp_fileexist in Books Online.
|||
Whoops, never mind! I forgot this proc is not documented / supported.
"Aaron Bertrand [SQL Server MVP]" wrote:
> Look up xp_fileexist in Books Online.
|||
A particular file can be checked with:
EXEC master..xp_fileexist 'c:\boot.ini'
HTH, Jens Suessmeyer.
|||
If this were implemented as a DTS package, you would perhaps find the file
system object more suitable for copying files, etc.
http://msdn.microsoft.com/library/d...ystemObject.asp
|||
Thanks for the response. xp_fileexist works great if you know the
exact file name. In my case, I don't know the file name. All I need is to
capture the files starting with "a", i.e. a*.*
So, EXEC master..xp_fileexist '\\server1\folder1\a*.*' does not work.
Any other ideas?
|||
CREATE TABLE #files
(
filename SYSNAME NULL
)
SET NOCOUNT ON
INSERT #files EXEC master..xp_cmdshell 'dir \\server1\folder1\a*.* /b'
SELECT COUNT(*) FROM #files WHERE filename IS NOT NULL
SELECT * FROM #files WHERE filename IS NOT NULL
DROP TABLE #files
|||
If you're already planning on using xp_cmdshell, then try this:
declare @cmd varchar(1024)
declare @path varchar(1024)
create table #fs
(
fId int identity (1, 1) primary key
,fName varchar(1024)
)
set @path = 'c:\Inetpub*.*'
set @cmd = 'dir /b ' + @path
insert #fs
(
fName
)
exec master.dbo.xp_cmdshell @cmd
if (exists (
select fs.fId
from #fs fs
where (fs.fName is not null)
))
begin
print 'exists'
end
else
begin
print 'does not exist'
end
ML
|||
Aaron! Thanks a lot! It works like a champ!
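Putting the pieces of this thread together, a sketch of the whole proc might look like the following. It is only a sketch: it assumes xp_cmdshell and xp_sendmail are available, the recipient address is a placeholder, and the exact "not found" text that dir emits can vary by system.

```sql
-- Sketch: dir-based existence check, then conditional copy or email.
declare @source varchar(150), @destination varchar(150)
declare @cmd varchar(300), @found int

select @source = '\\server1\folder1\a*.*',
       @destination = '\\serverx\folderx'

create table #files (fname sysname null)
set @cmd = 'dir /b "' + rtrim(@source) + '"'
insert #files exec master.dbo.xp_cmdshell @cmd

-- dir emits text such as "File Not Found" when nothing matches,
-- so filter that out along with NULL rows
select @found = count(*) from #files
where fname is not null and fname not like '%File Not Found%'

if @found = 0
    exec master.dbo.xp_sendmail
        @recipients = 'us@ourdomain.com',      -- placeholder address
        @subject    = 'File copy skipped',
        @message    = 'Files do not exist in the source folder.'
else
begin
    set @cmd = 'copy "' + rtrim(@source) + '" "' + rtrim(@destination) + '"'
    exec master.dbo.xp_cmdshell @cmd
end

drop table #files
```

The filter on the "File Not Found" row matters: without it, the error text dir writes when no files match would be counted as a file name.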
Monday, March 19, 2012
check file date and copy file
Hi,
I need to create a package that checks the dates of the files posted in a folder, e.g. H:\source. If no file newer than one day exists, check again an hour later. If such files do exist, copy them to c:\dest and then unzip them. Once this is done, send a notification email to user@mydomain.com.
Thanks,
Check out the FileWatcher task on SQLIS.com. It should help with identifying when the file appears. The rest of the tasks mentioned here are included with SSIS. You can use the File System task to copy files, and the Execute Process task to run a command-line utility to unzip them. The Send Mail task is used to send emails.
|||
Hi,
I installed the program from sqlis.com, but when I open the SSIS designer in Business Intelligence Development Studio, I can't find the File Watcher task in the toolbox. Can you tell me how to add this task?
Thanks,
|||
There are instructions on SQLIS.com:
"The component is provided as an MSI file, however to complete the installation, you will have to add the task to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Control Flow Items tab, and then check the File Watcher Task from the list."
Sunday, March 11, 2012
Check data before insert
I need help thinking about this problem :-)
I have an SSIS pkg that automatically downloads financial extracts from an FTP site. Once the files are downloaded, I load each extract into a table. The table is first emptied before the insert, so that each time it holds "fresh" data (whatever was in the extract for that day).
Once the extract data is in the table, I load the data into yet another table that combines data from many tables. Simple enough.
Problem is, sometimes when I download the extract, it hasn't been updated yet, so I'm downloading an OLD extract. This old data then gets loaded into the first table. That's ok, because it doesn't really hurt anything. I can always delete the table and reload it if necessary.
The problem occurs when the old data goes from this table into the OTHER table. We don't want old data in this other table!
I need a way to check that I'm not loading the same data 2 days in a row into the OTHER table.
I realize that I might be able to solve this problem without using SSIS, but the solutions I've come up with so far aren't 100% satisfactory. I can use a query to check dates and that sort of thing, but it isn't foolproof, and it would create problems if I ever need to manually force the process through, that is, to override the date logic.
Anyways, I'm wondering if there's an SSIS approach to this problem... I can't rely on timestamps on the data files either. They're not accurate.
This has been very perplexing to me.
Thanks
You have to determine a way to programmatically decide whether the data is "new" or not. Without that, I don't know how we can solve anything for you.
|||
Yes, I have that.
Once I load the table, I check a "date entered" field against the data's timestamp column.
That is, if I load the data on 8/15, the data timestamp should be 8/14, or the day before.
If it's 8/13, it's out of date.
But what happens if the data load on 8/14 (with a data timestamp of 8/13) didn't run... and I need to re-load the data from 8/14 on 8/15?
Now, date entered is 8/15, but the data has a data timestamp of 8/13, which in this case IS correct, but my logic will reject the data because it isn't from the day before.
That's the problem, as I see it. If I hardcode logic into my stored proc, then I can't easily override it if I need to.
I was thinking maybe, instead, I could put the date logic in a separate SQL function, then call this function from an Execute SQL task in SSIS. Then have the function return a 0 (failure) or 1 (success). If it returns a 0, then do not perform any more steps in the package. If it returns a 1, then finish running the steps in the package.
Is this possible to do? If so, that could solve the problem.
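To make the idea concrete, here is a sketch of the kind of function I mean (the table and column names are invented for illustration):

```sql
-- Hypothetical sketch: the Execute SQL task would run
--   SELECT dbo.fn_DataIsFresh(GETDATE())
-- map the result to a package variable, and a precedence constraint
-- would only let the rest of the package run when it returns 1.
create function dbo.fn_DataIsFresh (@LoadDate datetime)
returns bit
as
begin
    declare @fresh bit
    select @fresh = case
        when max(DataTimestamp) >= dateadd(day, -1, @LoadDate) then 1
        else 0
    end
    from dbo.ExtractStaging   -- made-up table and column names
    return @fresh
end
```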
Thanks
|||
My idea being, if I need to override the logic, I can simply disable the Execute SQL task that contains the function, then run the package manually.
|||
You could store the last successful date loaded, read that in at the beginning of your package, and then compare it to your data coming in. If the data coming in is newer than that date, then process it.
|||
I would second the above suggestion of saving the successful date. Alternatively, if you have a decent audit trail, you can use the date and time of the last run and do a comparison of the file's timestamp:
' Script task sketch. Assumes Imports System.IO at the top of the script
' and package variables named FileName, FileModifiedDate, FileCreationDate.
Dim vars As Variables
Dts.VariableDispenser.LockForRead("FileName")
Dts.VariableDispenser.GetVariables(vars)
Dim path As String = vars.Item(0).Value.ToString
vars.Unlock()   ' release the read lock before locking again
If File.Exists(path) = False Then
' File does not exist, no attributes to read
Else
Dts.VariableDispenser.LockForWrite("FileModifiedDate")
Dts.VariableDispenser.LockForWrite("FileCreationDate")
Dts.VariableDispenser.GetVariables(vars)
vars("FileModifiedDate").Value = File.GetLastWriteTime(path)
vars("FileCreationDate").Value = File.GetCreationTime(path)
vars.Unlock()
End If
Dts.TaskResult = Dts.Results.Success
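The earlier suggestion of saving the last successful load date could be sketched like this (the table and variable names are made up):

```sql
-- Sketch: a one-row control table holding the last good load date.
create table dbo.LoadControl (LastLoadTimestamp datetime not null)
insert dbo.LoadControl values ('20000101')   -- seed value

-- At the start of the package, read it into a package variable:
select LastLoadTimestamp from dbo.LoadControl

-- After a successful load of data stamped @NewTimestamp, advance it.
-- The WHERE clause makes a re-run of old data a no-op.
declare @NewTimestamp datetime
set @NewTimestamp = getdate()   -- in practice: the incoming data's stamp
update dbo.LoadControl
set LastLoadTimestamp = @NewTimestamp
where @NewTimestamp > LastLoadTimestamp
```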
|||
I'm not sure if I've made my problem very clear :-)
As I mentioned in a previous post, I already have a way to determine if data is out of date or not on any given day.
The problem is that I want to be able to override this logic when I need to. Please see my previous post, which explains the problem.
Q.) I am wondering if I can call a function in an Execute SQL task to have it return a value to the package. Depending on the value returned, I want to halt execution of the package.
Thanks
|||
Never mind... I realize my logic won't work anyway.
I just need to check the timestamps in the file names.
Wednesday, March 7, 2012
Check all CPU's are used
We added a new CPU to our 1 CPU box, making it a 2 CPU box.
In order to check that SQL is using both CPUs, I'm going
to load 2 bcp files simultaneously. I'll check Task
Manager to monitor that CPU usage increases for both CPUs.
Is there a better way to show my sysadmin that both CPUs
are being used? He refuses to believe that SQL will
recognize both CPUs without a reconfiguration.
TIA,
Jack
|||
In Enterprise Manager, right-click the server name, select Properties and
click the processor tab. Check the CPUs you want to use. You may need to
stop and restart the services...can't remember.
Kevin Hill
President
3NF Consulting
www.3nf-inc.com/NewsGroups.htm
|||
Hi,
No need to do a configuration change to use the second CPU.
How to check:
1. Use Performance Monitor from Control Panel ... Admin Tools.
2. From the counters, select Processor and add the 2 CPUs
separately.
3. Now run your SQL Server batch processes.
This will give you a picture of CPU usage.
Thanks
Hari
MCDBA
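If the sysadmin wants proof from SQL Server itself, you could also show him the affinity setting; a sketch (the default run_value of 0 means no CPUs are excluded, so SQL Server may schedule work on every CPU the OS exposes):

```sql
-- 'affinity mask' is an advanced option, so expose it first
exec sp_configure 'show advanced options', 1
reconfigure
exec sp_configure 'affinity mask'
-- run_value = 0 (the default) means all CPUs are available to SQL Server
```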
Friday, February 10, 2012
Changing Transaction Logs
I have recently completed the SQL Programming course. On
this course we were advised to keep transaction log files
(.ldf) and database files (.mdf/.ndf) on separate drives.
On inspection I have found that we need to do this. Can
someone please review the following action plan and advise
whether it will work? If not, can you suggest an
alternative?
Carry out the following changes "outside of Working
hours": -
1. Backup all databases and truncate all Transaction Logs.
2. For each user database, using Enterprise Manager,
display Properties, Transaction Log. Select the current
transaction log, note the file properties and then delete
it.
3. Whilst still in Properties, Transaction Log, create a
new transaction log, replicating the file properties.
Many Thanks
Tony C.
Hi,
No need to do such complex tasks:
1. Detach the database (execute sp_detach_db <dbname> in Query
Analyzer) -- now the MDF, NDF and LDF are released by SQL Server
2. Copy the LDF file to the new drive
3. Attach the database back (execute sp_attach_db
<dbname>, '<mdf_filename_with_path>', '<ldf_filename_with_path>' in Query
Analyzer)
Now the database will be attached back to SQL Server.
Note:
Please do this activity when there is no user connected.
Thanks
Hari
MCDBA
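For reference, those three steps might look like this in Query Analyzer. The database name and paths are placeholders; run it while no users are connected:

```sql
use master
exec sp_detach_db 'MyDB'

-- move the log file; xp_cmdshell shown here, but copying by hand works too
exec xp_cmdshell 'move D:\Data\MyDB_log.ldf E:\Logs\MyDB_log.ldf'

exec sp_attach_db 'MyDB',
    'D:\Data\MyDB.mdf',
    'E:\Logs\MyDB_log.ldf'
```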
|||
Hi Hari,
Although your script looks just fine, I don't have the
necessary NT network access to physically move files from
one destination to another... Therefore I will have to
go with my original plan, if it is OK. I can run all the
steps needed in SQL Server, as I am the SQL Server
administrator.
Many thanks anyway.
Tony C.
>