Since the release of SQL Server 2005, the spotlight has been on SQL Server Integration Services (SSIS), its new and powerful ETL tool (short for extraction, transformation and loading). SSIS makes managing SQL Server data a more customized and flexible task. Yet DTS woes continue, both for those who have migrated to SQL Server 2005 and for those who have not. We've compiled a list of 10 frequently asked questions involving DTS, which should ease your troubles and show you new ways to use this tool.
Frequently Asked Questions: DTS packages in SQL Server
1. What are the differences between DTS and SSIS?
SSIS offers so much more than DTS. Rather than view it as the next version of DTS, I view it instead as version 1 of a new product.
Explaining the differences between SQL Server 2000's DTS and SQL Server 2005's Integration Services requires either a book or an entire dedicated Web site. In fact, I just purchased Professional SQL Server 2005 Integration Services (Wrox) and it looks great. I've also found great examples and tons of information, including many sample packages, on sqlis.com. Finally, you can find webcasts, articles and blogs by the Microsoft SSIS development and management team.
2. Is there a DTS package designed to overwrite extract.xls?
You can easily delete a file using a script task. See the examples at www.sqldts.com.
Consider creating a very simple script task that performs the delete, then have your DTS package flow move on to the data pump section when the delete succeeds.
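As a sketch of that approach, the delete step in an ActiveX Script task might look like the following. (The global variable name ExtractFileName is an assumption; substitute whatever variable holds the path to your extract.xls.)

```vbscript
Option Explicit

Function Main()
    Dim oFSO, sFileName
    ' Hypothetical global variable holding the full path to extract.xls
    sFileName = DTSGlobalVariables("ExtractFileName").Value
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    ' Delete the old workbook only if it exists, so a missing file
    ' does not fail the step on the first run
    If oFSO.FileExists(sFileName) Then
        oFSO.DeleteFile sFileName
    End If
    Set oFSO = Nothing
    Main = DTSTaskExecResult_Success
End Function
```

With an 'On Success' workflow from this task to your data pump, the package deletes the stale workbook and then recreates it on every run.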
3. How can I prevent DTS package failures?
How can I make a job successful when it contains a DTS package that transfers data from a text file to a SQL Server table and updates the table? The problem is the job should run daily but the file may not always be there.
If I understand correctly, you want your DTS package not to fail just because the text file does not exist. Consider working with the FileSystemObject in an ActiveX Script task: have your DTS package include an ActiveX Script task that verifies the file exists. If the file does not exist, the DTS package simply ends. If the file does exist, an 'On Success' flow executes your data pump, transferring the data from the file to the SQL Server tables. In the following example, the file name is stored in a global variable.
Function Main()
    Dim oFSO, sFileName
    ' Get the name of the file from the global variable "ImportFileName"
    sFileName = DTSGlobalVariables("ImportFileName").Value
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    ' Check for file and return appropriate result
    If oFSO.FileExists(sFileName) Then
        Main = DTSStepScriptResult_ExecuteTask
    Else
        Main = DTSStepScriptResult_DontExecuteTask
    End If
    Set oFSO = Nothing
End Function
4. How should I handle metadata processes for SSIS projects?
It turns out that SSIS defines system package variables, to which you have read-only access, containing exactly the data you are looking for. In fact, Donald Farmer (Microsoft SSIS guru) created a package that appends this data to an audit table. You can view all of this information in his article 'Counting Rows in SQL Server Integration Services'.
As you will see, some of the data that can be captured includes package name, machine name, username, ErrorRows, GoodRows and execution start time. Take a look in SSIS and view all of the system-defined package variables.
5. What are some ways to efficiently push and pull data in SQL Server?
I was able to access Sybase ASE data at one client site once I installed the Sybase client drivers on the machine running SQL Server. If I remember correctly, this provided me with Sybase ASE ODBC drivers. I accessed Sybase data both through replication and in DTS packages under SQL Server 2000.
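As a rough sketch of the DTS side of that setup, once the Sybase client and its ODBC driver are installed, an ActiveX Script task can pull data over ADO. The DSN name, credentials, table name and global variable here are all assumptions for illustration:

```vbscript
Option Explicit

Function Main()
    Dim oConn, oRS
    Set oConn = CreateObject("ADODB.Connection")
    ' "SybaseASE" is a hypothetical ODBC DSN pointing at the Sybase server
    oConn.Open "DSN=SybaseASE;UID=myuser;PWD=mypassword;"
    ' Hypothetical query; in practice a data pump task would do the bulk transfer
    Set oRS = oConn.Execute("SELECT COUNT(*) FROM some_table")
    ' Stash the result in a package global variable for later steps
    DTSGlobalVariables("SourceRowCount").Value = oRS.Fields(0).Value
    oRS.Close
    oConn.Close
    Set oRS = Nothing
    Set oConn = Nothing
    Main = DTSTaskExecResult_Success
End Function
```

For the actual row transfer, the same DSN can back the source connection of a Transform Data (data pump) task, which is usually more efficient than moving rows through script.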
6. How can I find and delete file names?
I am trying to get a list of file names with .bak and .trn extensions from a specific path (say, a D:\Test folder) that contains files. I want to find and delete the files with .bak or .trn extensions and old dates. How is this done in T-SQL or Data Transformation Services (DTS)?
Consider working with the FileSystemObject in ActiveX Script tasks. Here's some sample code:
' Sample code that copies a file
Option Explicit
Function Main()
    Dim oFSO
    Dim sSourceFile
    Dim sDestinationFile
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    sSourceFile = "C:\SourceFile.txt"
    sDestinationFile = "C:\DestinationFile.txt"
    oFSO.CopyFile sSourceFile, sDestinationFile
    ' Clean up
    Set oFSO = Nothing
    Main = DTSTaskExecResult_Success
End Function

' Sample code that moves a file
Option Explicit
Function Main()
    Dim oFSO
    Dim sSourceFile
    Dim sDestinationFile
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    sSourceFile = "C:\SourceFile.txt"
    sDestinationFile = "C:\Folder\DestinationFile.txt"
    oFSO.MoveFile sSourceFile, sDestinationFile
    ' Clean up
    Set oFSO = Nothing
    Main = DTSTaskExecResult_Success
End Function

Here's one that may help you -- it checks the file's date and time:

Option Explicit
Function Main()
    Dim oFSO
    Dim oFile
    Dim sSourceFile
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    sSourceFile = DTSGlobalVariables("SourceFileName").Value
    Set oFile = oFSO.GetFile(sSourceFile)
    If oFile.DateCreated < Date Then
        Main = DTSTaskExecResult_Success
    Else
        Main = DTSTaskExecResult_Failure
    End If
    ' Clean up
    Set oFile = Nothing
    Set oFSO = Nothing
End Function
7. Is there a way to use SQLMail to send query results?
I am new to SQL Server and have created a Data Transformation Services (DTS) package that outputs query results. I want to run the query every morning and send the results in an email. How should I go about it? Do I need to use SQL Agent?
You will need SQL Agent. I recommend that you integrate SQL Server with Exchange or another MAPI email system. If your email system is Exchange, your task is easy: SQLMail is essentially the integration of SQL Server and Exchange. You can set it up by following the 'SQL Mail, setting up' link in Books Online (BOL). Once this is done, you can send mail through operators you define under SQL Server Agent, or send it from Transact-SQL steps in your DTS package using xp_sendmail.
8. What are some ways to perform incremental loads and extract data?
I've had to perform incremental loads when extracting data to populate a data mart. To do this I wrote extraction stored procedures that were executed as a SQL task within my Data Transformation Service (DTS) packages. Our source data tables were designed to allow this to happen quite easily. Here are the columns in each source table that allowed us to easily perform an incremental load/extract:
Identity Column as PK or Unique Constraint (int column)
This column allowed us to see if a row in the source was new or already existed by comparing the source ID value with those that exist in the destination.
LastUpdateDate (DateTime Column)
This column was automatically updated by our application. If we found that our row was an existing row (based on the identity column value above) we were able to compare the source LastUpdateDate column with the DestinationUpdateDate column to see if the row's data changed since the last extraction.
To make things easier for us, we created a table that held the current Id value and current update date values for each table we extracted. This allowed for easy comparison during each load and it turned out that this table was also heavily reported against. (Reports contained the extract datetime so report viewers could clearly see how current the data was.)
If you do not have the luxury of these columns in your source, you may have to key off the source table's primary key. You may also have to compare the columns that can potentially change.
9. Can I generate multiple flat files from a single flat file source?
It's times like this that I miss some of my favorite Unix utilities. Unix supports the split command, which does exactly what you are looking for! Consider creating a DTS package with an ActiveX Script task in which you use the Scripting Runtime library's FileSystemObject to split a large text file into multiple smaller files.
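A minimal sketch of that ActiveX Script approach follows; the file paths and the 10,000-line chunk size are assumptions you would adjust:

```vbscript
Option Explicit

Function Main()
    Dim oFSO, oInput, oOutput
    Dim nLines, nFile
    Const LINES_PER_FILE = 10000   ' assumed chunk size
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    ' Hypothetical source path; swap in your own large file
    Set oInput = oFSO.OpenTextFile("C:\BigFile.txt", 1)   ' 1 = ForReading
    nLines = 0
    nFile = 0
    Do While Not oInput.AtEndOfStream
        ' Start a new output file every LINES_PER_FILE lines
        If nLines Mod LINES_PER_FILE = 0 Then
            If nFile > 0 Then oOutput.Close
            nFile = nFile + 1
            Set oOutput = oFSO.CreateTextFile("C:\BigFile_" & nFile & ".txt", True)
        End If
        oOutput.WriteLine oInput.ReadLine
        nLines = nLines + 1
    Loop
    If nFile > 0 Then oOutput.Close
    oInput.Close
    Set oFSO = Nothing
    Main = DTSTaskExecResult_Success
End Function
```

Each resulting BigFile_n.txt can then feed its own data pump task, or the task can be parameterized with global variables for the paths and chunk size.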
10. Is there a way to copy views from one database server to another?
I recommend that you create a DTS package to transfer the object. There is a 'Copy SQL Server Objects Task' made exactly for this purpose. In this task you specify the source and destination servers and the source and destination databases, along with the object(s) you wish to transfer. You will have to specify the exact database object(s) on the Copy tab of this task.
Create the DTS package and verify that you can execute it in DTS Design mode. Verify the view was indeed copied to the remote server. You may have to drop the destination object each time the package is executed. Save the package; you can then have the SQL Server job scheduler automatically kick off the package by right-clicking the newly created package and choosing the Schedule Package option. From there, you can provide the job scheduling details.
One additional point: I'm confused by your statement that the size of the view can grow quite large over time. Views, unlike tables, do not hold data. They are simply named SQL statements we create, perhaps to simplify end users' perception of the data in our database. I've also used views for security purposes, to hide or exclude sensitive columns. Views, therefore, are not large, because they are simply database objects that hold SQL statements. Granted, the table(s) the view 'hits' can certainly be large.
ABOUT OUR EXPERT:
Joe Toscano of RDA is a SQL Server database consultant, instructor and speaker based near Philadelphia, Pa. He has worked extensively with SQL Server since version 6.5, as well as Sybase and Informix prior to that. Joe's areas of expertise include Data Transformation Services (DTS), Integration Services (SSIS), Transact-SQL (T-SQL) programming, data warehousing, performance tuning, replication, high availability (mirroring), security, disaster recovery, upgrading, migration projects and lights-out DBA tasks.