Wednesday, January 30, 2008

Differential Backups

There are several backup methods, and one needs to make a careful choice of which method best suits one's requirements.

1. The simplest and most widely used method is the complete (full) backup. A complete backup also includes a copy of the active transaction log; this is used to recover the database to the last possible consistent state.

backup database BCG
to disk = 'c:\bcg.bak'

2. The second method is a file backup, where one can back up one file at a time instead of backing up all the files in one go. This method is handy when the database is very large and backing up all the files would take too long.

BACKUP DATABASE BCG
FILE = 'BCG_Data_1'
TO DISK = 'C:\BCG_File1.bak'

3. A differential backup creates a copy of all the changes that have taken place in the database since the last complete backup. The command is the same as for the complete backup except for the DIFFERENTIAL clause.

BACKUP DATABASE BCG to disk = 'C:\bcg_1.bak' with differential

4. The transaction log backup takes a backup of only the transaction log file. Once the transaction log is backed up it is marked for truncation and a new log is started. Thus, when multiple log backups are being applied to a database, it is important that all the logs are available and that they are applied in the same sequence.

BACKUP LOG BCG TO disk = 'C:\BCGLog.log'

How a backup is restored also changes depending on the method used for backing up the database. The simplest of all is the full restore. By default the RECOVERY mode is used; RECOVERY means that once SQL Server is done with the restore it will roll back all the incomplete transactions in the database and bring the database to a consistent state.

restore database BCGNew from disk = 'C:\bcg.bak'

In case a differential backup or transaction logs are to be applied on top of a restore, we would not want the database to recover and roll back once the restore is done, as it is quite possible that a transaction was committed in a following file. Thus the NORECOVERY clause is used with the restore option.

backup database BCG to disk = 'c:\bcg_1.bak' with differential

BACKUP LOG BCG TO disk = 'C:\BCGLog.log'

Assuming that between each of these commands there are changes and transactions that have been committed to the database, what is expected is that when the new database is restored all these changes are available. The sequence to recover the data to the last possible state would be as follows:

restore database BCGNew
from disk = 'C:\bcg.bak'
with norecovery
, move 'BCGWDCM_GLC_Data' to 'C:\BCGNew_data'
, move 'BCG_data_1' to 'C:\BCG_data_1'
, move 'BCGWDCM_GLC_Log' to 'C:\BCGWDCM_GLC_Log'

restore database BCGNew
from disk = 'C:\bcg_1.bak'
with norecovery

restore log BCGnew
from disk = 'C:\BCGLog.log'

The point to be noted here is that the NORECOVERY clause is supplied with every restore but the last. Once a restore is done without the NORECOVERY option, no more restores are allowed on the database.
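Conversely, if the final restore was accidentally run with NORECOVERY, the database is left stuck in the restoring state. A minimal sketch, assuming no further backup files remain to be applied, is a recovery-only restore:

RESTORE DATABASE BCGNew WITH RECOVERY

No backup file is supplied here; the command simply runs the recovery phase and brings the database online.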

Tuesday, January 29, 2008

Database Backups and Restore

A SQL Server database is logically divided into filegroups. A logical filegroup in SQL Server maps to a set of physical files on the disk. It is mandatory to have at least one filegroup, called PRIMARY, which is created by default. New filegroups can be created by users if required.


A filegroup can be considered as a logical storage unit to house database objects which map to one or multiple filesystem files. New filegroups can be easily added to an existing database to partition it.

ALTER DATABASE BCG
ADD FILE (NAME='BCG_data_1', FILENAME='C:\BCG_data_1.ndf')

Filegroups are stored in a system catalog called sysfilegroups, so a list can easily be obtained:
select * from sysfilegroups

To move an existing table to a newly created filegroup, a clustered index can be created on that filegroup; as the base table is always stored with the clustered index, moving the clustered index also moves the table.

CREATE CLUSTERED INDEX IDX_ProductID ON dbo.OrdersDetail (ProductID)
ON masters

Another great advantage of using filegroups apart from partitioning the database is that advanced restore options make it possible to restore just one filegroup from the available ones in a database. This allows a partial recovery of the SQL database which comes very handy when the database is huge.

Irrespective of which filegroup is being restored, the primary filegroup is always restored, as the system catalogs (except full-text catalogs) are stored on the primary filegroup. All the filegroups that are not restored are marked as offline.

backup database BCG
to disk = 'c:\bcg.bak'

This command will list all the files and filegroups in the backup file:
restore filelistonly
from disk = 'c:\bcg.bak'

This command will list all the backup sets in the backup file. A backup file can contain more than one backup set: if a backup is applied to an existing backup file, the file is not replaced; instead the new backup is appended to the file with a new file number.

restore headeronly
from disk = 'c:\bcg.bak'

Which backup set to restore from the backup file can be specified using the FILE = number clause in the restore statement. This clause should appear after the WITH keyword in the restore command.
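As a sketch, assuming the second backup set appended to bcg.bak is the one required:

restore database BCGNew
from disk = 'c:\bcg.bak'
with file = 2

The restore headeronly output shown above is where the file number for each backup set can be looked up.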

A partial recovery is indicated using the partial clause in the restore command as shown below.
restore database BCGNew
filegroup = 'primary'
from disk = 'C:\bcg.bak'
with partial, recovery
, move 'BCGWDCM_GLC_Data' to 'C:\BCGNew_data'
, move 'BCG_data_1' to 'C:\BCG_data_1'
, move 'BCGWDCM_GLC_Log' to 'C:\BCGWDCM_GLC_Log'

When a partial recovery is made, although all the tables are created, only the tables on the filegroup being restored are available to the user; all other tables are marked as offline.

Tuesday, January 22, 2008

Delete the Zero Byte Files

Well, we had this DTS task that would pump files from a SQL source. The problem was that if it didn't find any valid record to be exported it would still create a zero byte file; the idea was to delete all these files with a zero byte size.

Option Explicit

Sub Main
    Dim FSO
    Dim Folder

    Set FSO = CreateObject("Scripting.FileSystemObject")
    Set Folder = FSO.GetFolder("D:\My Documents\AGIS\DHL\06 SAMAR")

    Call CheckSubFolders(Folder)
End Sub

Sub CheckSubFolders(Folder)
    Dim SubFolders
    Dim SubFolder
    Dim Files
    Dim File

    Set Files = Folder.Files

    If Files.Count <> 0 Then
        For Each File In Files
            If File.Size = 0 Then
                WScript.Echo "File " & File.Path & " should be deleted."
            End If
        Next
    End If

    Set SubFolders = Folder.SubFolders
    If SubFolders.Count <> 0 Then
        For Each SubFolder In SubFolders
            Call CheckSubFolders(SubFolder)
        Next
    End If
End Sub
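The script above only reports the offending files. Since the goal is to actually remove them, a minimal sketch of the change is to swap the Echo call for the FileSystemObject File object's Delete method:

If File.Size = 0 Then
    ' Permanently removes the zero byte file
    ' (True forces deletion even if the file is read-only)
    File.Delete True
End If

Running it once in report-only mode first is a sensible precaution before enabling the delete.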

Friday, January 11, 2008

Dynamically Setting properties for DTS tasks

DTS is a workflow application designed by Microsoft for, as the name suggests, Data Transformation Services. It can be used extensively in data import and export routines against multiple data sources.

I recently had this project where we had to export data to certain folders on an FTP server at a preset time interval: every 10 minutes we had to create a file for all the sales orders that had been shipped in that interval. First we thought of creating an application to do this, but as we were short of time we decided to use the built-in DTS services to achieve the same.

Designing the workflow was an easy task; the issue was how to dynamically rename the file being created every 10 minutes. We wanted a mechanism to create the text file name based on the server timestamp. Finally we realised that we could use an ActiveX script to achieve this.

The script we used was pretty simple below is the code.

Function Main()
    DTSGlobalVariables("SaleLineFileName").Value = "C:\Tmp\" & Year(Now()) & "-" & Month(Now()) & "-" & Day(Now()) & "-" & Hour(Time()) & "-" & Minute(Time()) & "-" & Second(Time()) & ".txt"

    Set pkg = DTSGlobalVariables.Parent
    Set conTextFile = pkg.Connections("salesline")
    conTextFile.DataSource = DTSGlobalVariables("SaleLineFileName").Value

    Main = DTSTaskExecResult_Success
End Function

Another requirement was to read a number of files from a folder and to import all the files that existed there. Below is the script:

Function Main()

    'Defines the folder to read filenames from
    sFolder = "D:\Folder\"

    'Defines an object variable for the package
    Set pkg = DTSGlobalVariables.Parent

    'Defines the connection for the text file source from which the data import will read
    'This is what we will be setting as a dynamic value
    Set conTextFile = pkg.Connections("Text File (Source)")

    'Defines the object variable for the Transform Data Task that transforms the data from the text file to the database
    Set pumpTask = pkg.Steps("DTSStep_DTSDataPumpTask_1")

    'This is where we will read through the filenames
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    Set folder = oFSO.GetFolder(sFolder)
    Set files = folder.Files

    'Iterate through all the files in the folder and retrieve their filenames
    For Each itemFiles In files
        'Use a variable for the complete path of the file
        sFileName = sFolder & itemFiles.Name

        'Assign the Data Source of the text file
        conTextFile.DataSource = sFileName

        'Assign the path of the filename to a global variable. This is needed as you will have to
        'define a global variable for your text file source in your DTS package
        DTSGlobalVariables("importFilename").Value = sFileName

        'Execute the Transform Data Task
        pumpTask.Execute
    Next

    'Returns a value to the task that the execution was successful
    Main = DTSTaskExecResult_Success

End Function

Tuesday, January 08, 2008

Returning Arrays in VB Functions

Well, yes, one can return arrays from VB functions, but internally the arrays are still passed back as Variants or objects.

Thus a function definition like the one below:-

Public Function WeekDays() As String()
    'Split returns a true String array; Array() would return a Variant
    'array, which cannot be assigned to a String() return value
    WeekDays = Split("Sun Mon Tue Wed Thu Fri Sat")
End Function

is perfectly valid, as this is a function which returns a string array. However, as internally arrays are handled as objects or variants, the following statement is not possible:

Msgbox "The first day of the Week is " & WeekDays(0)

Instead, what one would be expected to do is this :-
Dim strArr() As String
strArr = WeekDays()
MsgBox "The first day of the Week is " & strArr(0)