Using Data Transformation Services: Package Execution Using SQL Server Agent, Introduction

The scenario is one of creating a DTS package that populates a table in an MS Access database from a SQL Server database on a schedule. The scheduling is carried out by creating a SQL Job. MS Access is often used for creating reports, and this scheduling makes it easier to create up-to-date reports using data from a SQL Server. The question of scheduling DTS packages often comes up in SQL forums.

In previous tutorials, step-by-step methods of creating and executing DTS packages with the Import/Export Wizard as well as with the DTS Designer have been described in considerable detail. Executing DTS packages designed and saved on the SQL Server from ASP.NET using Visual Studio 2003 has also been treated in an earlier article.

The main focus of this tutorial is to guide the user in scheduling the execution of DTS packages by creating a SQL Job. In order to complete the tasks in this tutorial, the reader is advised to review the previous tutorials on this topic, available online here at ASP Free.

We will follow the steps indicated here:

Part A: Creating the Package
  • Create a DTS package
  • Create an SQL Connection
  • Create an MS Access Connection
  • Create a Transform Data Task
      This task populates a table in MS Access with data in a table in the SQL 2000 Server database.
  • Create an Execute SQL Task
  • Add Workflow to Package
  • Save the Package
Part B: Scheduling the Package
  • Create a SQL Job
  • Configure the Job for Scheduling
  • Save the Job
Part A: Creating the Package

A package called DeleteAnAcessTable is created and the necessary connections, tasks, and workflow are configured. We highlight a couple of essential steps and skip the detailed description, which is available in earlier tutorials. Screenshots are shown for the essential steps.

Creating the package: Details of the two connections

There are two connections in the package DeleteAnAcessTable, one to SQL Server 2000 and the other to the MS Access database. All that is needed to set up a connection is to drag the connection control from the connection collection and drop it on the designer.

The next two pictures show the two connections. The first one is the source (SQL 2000 Server), and the second is the destination (MS Access database at its default location).

Creating a SQL 2000 Connection

Creating an MS Access Connection

The table in the SQL 2000 Server whose content populates the MS Access table has the following structure.

The design view of the destination table in MS Access has the following information:

Create a Transform Data Task

The data transformation task is very simple, copying data from one table to another. The source table is the SQL Server database table and the destination table is the MS Access table. Here, only the Transformations tab of the Transform Data Task is shown:

Create an Execute SQL Task

The business objective requires that the MS Access table be deleted and a new one created before the data from SQL Server is added. In order to carry this out, an Execute SQL Task is added. This task should drop the table and recreate it with the same design features. The next picture shows the features of such a task.
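As an illustration of what the Execute SQL task's statement might contain, here is a sketch in Jet (Access) SQL. The table and column names below are hypothetical, since the actual design comes from the screenshots; note also that the Jet provider generally executes one statement at a time, so the DROP and CREATE may need to be issued as separate statements.

```sql
-- Hypothetical Jet SQL for the Execute SQL task.
-- Table and column names are illustrative, not taken from the screenshots.
DROP TABLE Orders;

CREATE TABLE Orders (
    OrderID      INTEGER,
    CustomerName TEXT(50),
    OrderDate    DATETIME
);
```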

Add Workflow to Package

The workflow is needed because the MS Access table must exist before the data can be pumped from the SQL Server. Hence the data will be transferred from SQL Server to MS Access only when the Execute SQL task succeeds, and not otherwise. The Workflow properties page (obtained by right-clicking the workflow) shows the information in the next picture.

Save the package

This completes the design of the package that deletes a table, recreates the table, and then populates it with data from SQL Server. This is saved to the SQL 2000 server, and the next two pictures show the details of this package.

Part B: Scheduling the Package: Create a SQL Job

Saved DTS packages can be run by several means, including invoking the dtsrunui utility from the command prompt. In this section, however, a method using the SQL Server Agent is described. In order to access the SQL Server Agent, expand the Management node of the SQL Server; this gives access to the SQL Server Agent.

This comprises the collections Alerts, Operators, and Jobs. In order to keep the discussion simple, only Jobs will be discussed, but in a production environment all of them are necessary for successful management of the database. For the purposes of this tutorial, a new job is created, either by right-clicking Jobs as shown, or from the menu item (the icon is similar to the one for a job, but shows the pop-up text 'New Job').

Configuring and Scheduling the Job

This creates a new job on the SQL Server, which must be configured for the task at hand. It comes with several tabs, General, Steps, Schedules, and Notifications, as shown. The General tab is configured with the required information, such as the name of the job, a description of its purpose, whether it is enabled or not, whether it runs on one server or several, and so on.

In the second tab, Steps, it is necessary to describe the various steps in this job. Steps can be created new, inserted, deleted, or edited. If a step has already been created (true in this case), its properties can be inspected by editing it as shown.

The name of the step is Only Step. In this step, an operating system command is given to the SQL Server, which runs the package on a server with Windows Integrated authentication (switch /E). The switch /N introduces the package name, which follows it, as shown above.
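A CmdExec step command of this kind might look like the following sketch; the server name is a placeholder, while /E requests Windows Integrated (trusted) authentication and /N names the package, as described above.

```
dtsrun /S "(local)" /E /N "DeleteAnAcessTable"
```

If the package were protected with a password, the /M switch would also be needed to supply it.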

In the Schedules tab, the actual scheduling is carried out. It has several fields that need configuring. In this tutorial, the job is run just once at a predetermined time, as shown. Depending on the requirements, however, scheduling can be implemented flexibly.
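For readers who prefer scripts to the Enterprise Manager dialogs, the same job can be sketched in Transact-SQL with the msdb system procedures that the GUI calls behind the scenes. The job name, step command, run date, and time below are illustrative, not taken from the screenshots.

```sql
-- Sketch of the job in T-SQL (SQL Server 2000); names and dates are illustrative.
EXEC msdb.dbo.sp_add_job
    @job_name = N'Run DeleteAnAcessTable';

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Run DeleteAnAcessTable',
    @step_name = N'Only Step',
    @subsystem = N'CMDEXEC',
    @command   = N'dtsrun /S "(local)" /E /N "DeleteAnAcessTable"';

-- freq_type = 1 requests a one-time schedule
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Run DeleteAnAcessTable',
    @name     = N'One-time run',
    @freq_type = 1,
    @active_start_date = 20040601,  -- YYYYMMDD, illustrative
    @active_start_time = 180000;    -- HHMMSS, illustrative

-- Register the job with the local server so the Agent will run it
EXEC msdb.dbo.sp_add_jobserver
    @job_name = N'Run DeleteAnAcessTable';
```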

This leaves the tab in the following state. If need be, the schedule can be edited or changed.

In the next tab, Notifications, messages can be sent to operators (by email, etc.). The next picture shows the Notifications window. The job, if it succeeds, writes to the Windows event log.

Save the job

This completes the configuration of the job. When it is saved, a related icon appears in the listing of jobs in the Enterprise Manager at the Jobs node, as shown.

By right-clicking the job's line item in the listing, several of the job-related tasks, such as starting and stopping the job, accessing the job history, and so on, can be carried out. The next picture shows the job history for the job configured in this tutorial.


Only the simplest kind of job scheduling, with just one step carrying out a one-time execution, was shown. Not only can jobs have many steps, but a schedule can also start a job in various ways: automatically when the SQL Server Agent starts, whenever the CPU is idle, one time as in this tutorial, or recurring according to a set pattern.

9 thoughts on “Using Data Transformation Services: Package Execution Using SQL Server Agent, Introduction”

  1. I thank you for reading through the article, assuming you already have. Do you have any comments on this article? Did it help you in any way? Did it raise any questions? Do you want something else covered in this topic? I have written quite a few other articles on DTS. You may also comment on any of the other articles as well. DTS is an excellent tool, and the designer is the easiest way to do most of the tasks. At least this has been my experience. 🙂

  2. Hi Jayaram
    The article is excellent – it gives a step-by-step guide for creating the package from the very beginning to the end. Since it also shows screenshots of the individual steps, there is no problem whatsoever. With the help of your guidance even a novice can create the package.
    But I have a query about the scenario. Here you are creating a package that loads the SQL Server data into Access. But if I have ~300 tables to be transferred, do I have to create as many packages? How can we have a generic package that will accept the table name at runtime from a global variable and do the needful? Obviously the corresponding table names and schema will remain the same. What type of task is needed in this case, and how can both the source and destination be assigned at runtime?

    Will you be kind enough to answer this question?

    Nirmalya Sinha

  3. I do not have an immediate, complete answer. I can think of two possibilities: first, to try the DTS Object Model with an .INI file; and second, using a template DTS package as a linked server and issuing a distributed query, probably incorporating a Dynamic Properties task. I may come up with an answer, but probably not soon. :'(

  4. Read your article; it is really nice as you have given step-by-step execution. I want to ask one thing: if I have two SQL Servers, one residing on the hosting provider's side and another on the client's local server, and I want to transfer data (a table) from live to local after some time duration, can I do this using DTS package execution?

    Please comment; that will help me a lot.


  5. Ok I will explain in detail.

    I have one intranet (offline) application for an order processing system. In this system, when a new order is inserted, it gets added to the order table of the local database. Now I want to enhance this application by adding a facility to insert orders online. For that I will create a new online web application which will work somewhat like a shopping cart application. Customers will select the products they want and place an order. This order information gets stored in the live database residing on the hosting provider's side.
    Then, after each 24 hrs. (or any fixed duration), I want to transfer this order table of the live DB to the local database of the intranet application. Can I use DTS for this?
    If in future the number of tables transferring from live to local increases, will DTS still be useful?

    Please let me know if more details are required.


  6. I think if you use DTS for the transfer and schedule the job nightly, you may be able to achieve what you want. Maybe you need to design the DTS task carefully. I hope this was useful.



  7. Thanks for your reply!
    One question… Will DTS work (is it best to use?) if my live SQL Server is on the hosting provider's side (ISP) and my local SQL Server is on the client side? Both these servers are not in the same network environment.


  8. My best guess is that it depends on what you can and cannot do on the hosting provider's server. They usually do not give you much room to act independently. Please send them a query defining what you want to do.
    I only have access to an mdb file on my hosting provider's server; otherwise I would have attempted it and would have known the answer to your question.
    Sorry, I cannot give a positive answer.


