Dynamically setting Mappings?

  • Hi All

    Is there any way to set dynamic mappings in a 'Data Flow Task'?

    That is, I am setting the 'source' and 'destination' tables (OLE DB components) from package variables...

    Since the structure of these tables varies, I have to set the mappings dynamically...

    Is there any way to achieve this through a 'Script Task', or any other way?
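    (To make the question concrete: what I ultimately need is equivalent to the following, where the table names are only known at run time. This is just a hypothetical T-SQL sketch of the intent, for example something an Execute SQL Task could run, not something the package currently does.)

    ```sql
    -- Hypothetical sketch of the intent: copy every column from a source table into a
    -- destination table whose names (and column types) are only known at run time.
    -- @src and @dest stand in for the Source_Variable / Dest_Variable package variables.
    DECLARE @src sysname, @dest sysname, @sql nvarchar(max);
    SET @src  = N'Source_Table';
    SET @dest = N'Dest_Table';

    SET @sql = N'INSERT INTO ' + QUOTENAME(@dest)
             + N' SELECT * FROM ' + QUOTENAME(@src) + N';';

    EXEC sys.sp_executesql @sql;  -- assumes both tables are reachable from one connection
    ```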

    Regards,

    Agson Chellakudam

  • You cannot change the metadata (i.e. column info) of a data-flow at runtime.

    -Jamie

     

  • Hi All

    After going through some links and trying it myself, I understood that there is no easy and direct way to handle this situation...

    But I found a solution for one particular situation...

    That is, in my case the column names and the number of columns are the same for all the tables...

    The only varying factor is their data types...

    I will explain with an example.

    Consider two tables, Source_Table(Col1 varchar(10)) and Dest_Table(col1 varchar(10))

    I assign these table names to two variables, 'Source_Variable' and 'Dest_Variable'...

    The OLE DB Source and Destination components pick the respective table names from these variables.

    The package executed successfully...

    Now consider two other tables, Source_Table_Sec(col1 varchar(20)) and Dest_Table_Sec(col1 varchar(20)).
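    (Written out as DDL, the example tables are simply the following.)

    ```sql
    -- The four example tables from this post, written out as DDL.
    CREATE TABLE Source_Table     (Col1 varchar(10));
    CREATE TABLE Dest_Table       (col1 varchar(10));

    CREATE TABLE Source_Table_Sec (col1 varchar(20));
    CREATE TABLE Dest_Table_Sec   (col1 varchar(20));
    ```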

    The same variables are initialized with these table names...

    While executing the package, I get a truncation error, because the data length of Source_Table_Sec.col1 is > 10 (the data flow remembers the previous metadata).
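    (For example, a row like the second one below is the kind that trips the old 10-character metadata.)

    ```sql
    -- Hypothetical sample rows for Source_Table_Sec. The data flow still carries the
    -- 10-character metadata captured from Source_Table at design time, so rows like
    -- the second one are what raise the truncation error.
    INSERT INTO Source_Table_Sec (col1) VALUES ('short ok');         -- 8 characters, fits
    INSERT INTO Source_Table_Sec (col1) VALUES ('fifteen chars!!');  -- 15 characters, > 10
    ```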

    To handle this situation, I redirected all the rows (using the Error Output of the OLE DB Source) to another OLE DB Destination, which picks the table name from the same variable (Dest_Variable).

    So I am able to insert all the data into the destination table without failing the component...

    Regards

    Agson Chellakudam

     

     
