DTS: advice on dynamically specifying source db/tables

  • I would like to have one package with a dynamic task that basically has two connections: source and destination. I will use a SQL or Transform Data task to pull an entire table or multiple tables. Once created, suppose I need to import 5 or 15 tables. I would like to be able to execute a command in Query Analyzer, a .bat file, or a command file, passing it the source db, destination db, and the list of tables.

    I have tried a simple example with global variables. However, I was not successful in passing a table name. The problem was that it would not update the field mappings in the Transform Data task for those of the newly specified table. It would update the source fields but not the destination field list, so it would fail.

    Any details or code samples or advice would be appreciated. Thanks.

     


    smv929

  • I cannot see how this can be done from table to table. You are relying on the DTS package knowing which fields to join, for starters. If you wanted to put the results into a text file, you could dynamically populate the table and text file names.

    However, if you use the readconfig.ini file to change the database name you may have some joy, as long as the two tables are identical to each other in the two databases. I have never tried it, though.


    ------------------------------
    The Users are always right - when I'm not wrong!

  • <...cannot see how this can be done from table to table. You are relying on the dts package knowing which fields to join for starters.>

    I was hoping there was a setting in the dynamic task, or somewhere in the designer, to tell it which fields to use.

    <...If you wanted to put the results into a text file, you can dynamically populate table and text file name.>

    Can you be more specific? Using which task?

    <However, if you use the readconfig.ini file to change the database name you may have some joy as long as the two tables are identical to each other in the two databases.>

    The tables are identical among databases. I would prefer to use global variables; otherwise, I would have to edit a config file for each database. With a global variable, I could have a series of dtsrun commands. If I can't specify the table name dynamically due to the field-mapping error, then perhaps I can create a package for each table (yuk!) and just have the data source and destination be input via global variables. Then this would be the series of scripts I'd run when I needed, say, tables 1, 2 and 3:

    To populate db1...

    dtsrun..PackageForTable1....Source:=odbc1....Dest:=sqlDb1

    dtsrun..PackageForTable2....Source:=odbc1....Dest:=sqlDb1

    dtsrun..PackageForTable3....Source:=odbc1....Dest:=sqlDb1

    To populate db2...

    dtsrun..PackageForTable1....Source:=odbc2....Dest:=sqlDb2

    dtsrun..PackageForTable2....Source:=odbc2....Dest:=sqlDb2

    dtsrun..PackageForTable3....Source:=odbc2....Dest:=sqlDb2

    etc..

    Is there a better way? Thanks
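    For what it's worth, that series doesn't have to be typed by hand. A minimal sketch in Python of a driver for it, where the package names, server, and global-variable names are assumptions that must match your own packages, and dtsrun's /A switch takes GlobalVariableName:typeid=value (typeid 8 = string):

```python
# Sketch: generate and (optionally) run one dtsrun command per table,
# passing source and destination as DTS global variables.
# Assumptions: package names like "PackageForTable1", global variables
# named "Source" and "Dest", trusted connection (/E) to the local server.
import subprocess

def build_dtsrun_commands(tables, source, dest, server="(local)"):
    """Return one dtsrun command line per table; /A typeid 8 = string."""
    return [
        f"dtsrun /S {server} /E /N PackageForTable{t} "
        f"/A Source:8={source} /A Dest:8={dest}"
        for t in tables
    ]

if __name__ == "__main__":
    # Populate db1 from odbc1; swap the arguments to populate db2, etc.
    for cmd in build_dtsrun_commands([1, 2, 3], "odbc1", "sqlDb1"):
        print(cmd)
        # subprocess.run(cmd, shell=True, check=True)  # uncomment to execute
```

    One call per source/destination pair then replaces each hand-written block of dtsrun lines.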

     


    smv929

  • The answer to 1 would be to look for a good DTS book.

    The answer to 2 is to use an ActiveX script. You can set the source and destination fields dynamically by using global variables.

    The answer to question 3 is that you are quite right. You would need to dynamically change the ini file, which is a separate matter altogether.
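    If you do end up rewriting the ini between runs, that part is easy to script. A minimal sketch, assuming the file uses a [Connection] section with a Database key (both names are assumptions -- match your actual readconfig.ini layout):

```python
# Sketch: swap the database name in a DTS config .ini between dtsrun calls.
# The section and key names ("Connection", "Database") are assumptions.
import configparser

def set_database(ini_path, new_db, section="Connection", key="Database"):
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # preserve key case when rewriting
    cfg.read(ini_path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    cfg.set(section, key, new_db)
    with open(ini_path, "w") as f:
        cfg.write(f)

# e.g. set_database("readconfig.ini", "sqlDb2") before each dtsrun call
```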

    In case it helps you, I have a script that gets data from a table and populates it into a spreadsheet. You can tinker with it and find the property to change the source. It will work dynamically, as per your requirement:

    '**********************************************************************
    ' Visual Basic ActiveX Script
    '**********************************************************************
    Function Main()
        ' Declare variables
        Dim appExcel
        Dim newBook
        Dim oSheet
        Dim path
        Dim oPackage
        Dim oConn

        ' Set variable values
        Set appExcel = CreateObject("Excel.Application")
        Set newBook = appExcel.Workbooks.Add
        Set oSheet = newBook.Worksheets(1)
        path = "C:\...."

        ' Specify the column names in the Excel worksheet
        oSheet.Range("A1").Value = "Invoice Number"
        oSheet.Range("B1").Value = "Purch Ref"
        oSheet.Range("C1").Value = "Patient Ref"
        oSheet.Range("D1").Value = "Description"
        oSheet.Range("E1").Value = "Raised Date"
        oSheet.Range("F1").Value = "Type"

        ' Expand the columns to fit the text
        oSheet.Columns(1).ColumnWidth = 15
        oSheet.Columns(2).ColumnWidth = 15
        oSheet.Columns(3).ColumnWidth = 15
        oSheet.Columns(4).ColumnWidth = 50
        oSheet.Columns(5).ColumnWidth = 15
        oSheet.Columns(6).ColumnWidth = 15

        ' Format the date field
        'oSheet.Columns(5).NumberFormat = "dd-mmm-yy"

        ' Specify the name of the new Excel file to be created
        DTSGlobalVariables("fileName").Value = path & Year(Now) & "-" & Month(Now) & "-" & Day(Now) & ".xls"
        newBook.SaveAs DTSGlobalVariables("fileName").Value
        appExcel.Quit

        ' Dynamically repoint the destination connection at the new file.
        ' The Connections collection is 1-based; in this package the Excel
        ' file is connection 2 -- adjust the index to match your own package.
        Set oPackage = DTSGlobalVariables.Parent
        Set oConn = oPackage.Connections(2)
        oConn.DataSource = DTSGlobalVariables("fileName").Value

        Set oPackage = Nothing
        Set oConn = Nothing

        Main = DTSTaskExecResult_Success
    End Function


    ------------------------------
    The Users are always right - when I'm not wrong!

  • I needed a similar function, where I used DTS to let users of a VB.Net application import Excel or text files into tables.

    Instead of writing scripts for each table, I decided to write a general import tool which uses the DTS COM object model to create the package on the fly. The code reads the schema for both the input file and the destination table and builds the script necessary to import the file.

    The best way to go about this is to create a package, then save it as VB. Assuming you are VB proficient, analyze the code and you will see where you can "genericize" the package. The code is easy to follow and fairly linear.

    Once it has created the package in memory, you can execute it or save it to SQL Server for later execution.

    I don't have access to the code anymore so I cannot offer it up as a solution or example.
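    Since the code itself is gone, here is an independent sketch of the same schema-driven idea, with sqlite3 standing in for the real source and destination: read the destination table's columns at run time and build the statement on the fly, instead of hard-coding a mapping per table. The table and column names are invented for illustration:

```python
# Sketch of a schema-driven import: inspect the destination table's
# columns, then generate the matching INSERT instead of hard-coding one
# per table. sqlite3 stands in for the real source/destination here.
import sqlite3

def build_insert(conn, table):
    """Return (columns, INSERT statement) matching the table's actual schema."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    placeholders = ", ".join("?" for _ in cols)
    return cols, f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (InvoiceNumber TEXT, RaisedDate TEXT)")
cols, sql = build_insert(conn, "invoices")
conn.executemany(sql, [("INV-1", "2004-01-01"), ("INV-2", "2004-01-02")])
```

    The same pattern, pointed at the DTS transformations collection instead of an INSERT string, is what lets one tool serve every table.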

    - manash
