HOW TO Use SQL Server DTS Transform Data Task to export to dynamic Excel spreadsheet file

Many people (including me) are still using SQL Server 2000 – which means you might be using DTS (Data Transformation Services) to perform utility jobs on your databases. 
I had a requirement to output data to Excel (not my choice – Excel usually isn’t!) – which is subtly different to text output as CSV*.  Another requirement was a ‘generated’ file name in a known folder – as simple as a date/time-driven file name.  The ‘job’ runs once (or more) a day and so creates a new file each time.  Easy, right?

Well – it turns out that if you go through the normal motions of creating an Excel connection and using a Transform Data Task it’s likely to work first time and leave you thinking ‘great’.  Unfortunately when you run it again you’re likely to get a message similar to the following:

Table ‘New_Table$’ not found. 

After much digging (hence the reason for writing this), I discovered that in order to map the columns successfully you basically have to create the table in the spreadsheet before running the transformation (the ‘create table’ part of setting up the transformation).  With a dynamic file name, the next run effectively starts with a blank spreadsheet without the required receiving ‘table’, so it will fail unless you’ve recreated the sheet before the transformation.

OK so assuming you want to output some data to a spreadsheet with a dynamic filename with a Transform Data task then here’s how you do it.

  1. I put all the files in the same folder, so I use two global variables to build the file name – NetworkLocation and FinalLocation
    1. First, right-click on a blank part of your package screen and choose Package Properties, then the Global Variables tab.
    2. Create the two global variables NetworkLocation and FinalLocation.
    3. Edit the NetworkLocation value to be the folder (including a trailing ‘\’) where you want to store the files.
    4. Leave FinalLocation blank (we’ll set its value in the next step).
  2. Add an ActiveX Script Task with the following code (you can
    obviously tailor the file name to your needs).  It basically builds a
    date/time-based file name, and stores the concatenation of the network
    location and file name in the FinalLocation global variable.

    '**********************************************************************
    '  Visual Basic ActiveX Script
    '**********************************************************************
    Function Main()

        Dim fileName
        ' Build a name like MyOutput_yyyymmdd_hhmmss, zero-padding each part
        fileName = "MyOutput_" & DatePart("yyyy", Date) & _
                   Right("0" & DatePart("m", Date), 2) & _
                   Right("0" & DatePart("d", Date), 2) & "_" & _
                   Right("0" & DatePart("h", Now), 2) & _
                   Right("0" & DatePart("n", Now), 2) & _
                   Right("0" & DatePart("s", Now), 2)
        DTSGlobalVariables("FinalLocation").Value = _
            DTSGlobalVariables("NetworkLocation").Value & fileName & ".xls"

        Main = DTSTaskExecResult_Success

    End Function
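
    To make the result concrete: with NetworkLocation set to, say,
    \\myserver\exports\ (a made-up path – yours will differ), a run at
    14:32:05 on 24 May 2012 leaves FinalLocation holding:

        \\myserver\exports\MyOutput_20120524_143205.xls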

  3. Add your data connections
    1. Create your source connection (presumably SQL Server or wherever your data’s coming from)
    2. Create your destination connection (a Microsoft Excel 97-2000
      connection) and set an initial file name (ideally in the folder
      specified in NetworkLocation).
  4. Create a Data Transformation Task connecting your source and destination connections as per normal
    1. Add in your Table/View or SQL Query to define the source data.
    2. In the Destination tab click ‘Create…’ next to ‘Table name:’ (a
      ‘Create Destination Table’ dialog will pop up).  Change the table name
      (or column definitions) if you like, then COPY THE ENTIRE TEXT – and
      paste it into a text editor for safekeeping.
    3. Click OK (this will create a new sheet in the spreadsheet you’ve
      specified in your connection, ready to receive your data).  Your fields
      should then also be populated in the table below.
    4. Check the Transformations tab – you should get a nice one-to-one mapping between source and destination columns.
  5. Create a Dynamic Properties Task (Here’s where it all comes together)
    1. Click ‘Add…’ to create a new property.  Choose your Excel Connection and pick the ‘DataSource’ property, then click ‘Set…’
    2. Choose ‘Global Variable’ as the source and FinalLocation as the variable – Click OK.
    3. Click ‘Add…’ to create another property.  Choose the
      DTSTask_DTSDataPumpTask_? in the Tasks node (your data transformation
      task) and pick the ‘DestinationObjectName’ property.
    4. Choose ‘Query’ as the source and paste in your ‘CREATE TABLE‘ statement from earlier – Parse to check, then click OK.
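
      For reference, the generated text for an Excel destination looks
      something like the following (table and column names here are invented
      for illustration – use the exact statement you copied earlier):

          CREATE TABLE `New_Table` (
              `CustomerID` VarChar (255),
              `OrderDate` VarChar (255),
              `Amount` Float
          )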
  6. Create the Workflow for the task
    1. Select the Dynamic Properties Task, right-click and select Workflow –> Workflow Properties.
    2. Click ‘New’ and set the ActiveX Script Task as ‘Source Step’.
    3. Select the Data Transformation Task, right-click and choose Workflow Properties.
    4. Click ‘New’ and set the Dynamic Properties Task as ‘Source Step’.

That’s basically it – the package should end up looking something like this (ActiveX Script Task –> Dynamic Properties Task –> Transform Data Task):

It should also now do the following (in order):

  1. Set filenames (stored in global variables)
  2. Set the filename of the Excel connection, and create the destination
    ‘sheet’ from the global variables and the query in the transformation
  3. Do the transformation and store the results in a date/time named Excel spreadsheet in your configured folder
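
If the package falls over on the Excel connection, it’s worth checking the path that was actually generated.  A throwaway ActiveX Script Task wired in after the first one (my addition – not part of the recipe above) will display it:

    Function Main()
        ' Debug only: show the path the package is about to write to
        MsgBox DTSGlobalVariables("FinalLocation").Value
        Main = DTSTaskExecResult_Success
    End Function

Remove it once you’re happy the path looks right.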

If you ever need to change the query (columns) then you’ll need to regenerate the ‘CREATE TABLE’ statement and reflect the change in the Dynamic Properties Task (step 5.4).

*Typically an Excel export covers your back with datatypes where a CSV export won’t – so when you open a CSV in Excel you’ll have to format columns yourself, like converting dates, and changing long numbers from scientific format back to text.


One thought on “HOW TO Use SQL Server DTS Transform Data Task to export to dynamic Excel spreadsheet file”

  • May 24, 2012 at 2:32 pm

    Hi MSIMNER,

    Thanks a lot for this brilliant post. I had been banging my head around this problem for a couple of days now. Excellent procedural explanation of the process and the diagram helped confirm my understanding.
