Dynamic column names for insert statement?

  • Hi, I would like to supply the column names in an insert statement from a schema table, so that when running through a number of business-rule checks I can reference the schema table and maintain the columns only there, rather than maintaining named columns in multiple insert statements. My query for one check looks like the code below; I'm using dynamic SQL to execute the insert statement. My question is: is there a better or different way to do this without using dynamic SQL? I.e., is there a way I can use the columns parameter directly, like this instead?

    Insert ('+@columns+',KickoutID) Values('+@columns+',1);

    Dynamic code:

    Declare @columns as nvarchar(max);
    Declare @InvSQL as nvarchar(max);

    SELECT @columns = STUFF((
        SELECT ', [' + name + ']'
        FROM syscolumns
        WHERE id = OBJECT_ID('dbo.table_pvt')
        FOR XML PATH('')), 1, 1, '');

    print @columns;

    set @InvSQL =
    N'Merge dbo.table_Kickout as T
    Using dbo.table_pvt as S
    on T.ID = S.ID
    When not matched By Target
    And (S.InvFlag = 0 or S.InvFlag is null) Then
    Insert ('+@columns+',KickoutID) Values('+@columns+',1);';

    print @InvSQL;

    Exec sp_executesql @InvSQL;
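    To answer the direct question first: T-SQL does not allow a variable to stand in for a column list, so some form of dynamic SQL is unavoidable here. As a side note, on SQL Server 2017 and later the same column list can be built with STRING_AGG instead of the STUFF/FOR XML PATH trick, and sys.columns is the supported replacement for the deprecated syscolumns view. A minimal sketch, assuming the same dbo.table_pvt table:

    ```sql
    -- Sketch only: build the bracketed column list with STRING_AGG (SQL Server 2017+).
    -- CONVERT to nvarchar(max) avoids STRING_AGG's default 4000-character result limit.
    DECLARE @columns nvarchar(max);

    SELECT @columns = STRING_AGG(CONVERT(nvarchar(max), QUOTENAME(name)), ', ')
                      WITHIN GROUP (ORDER BY column_id)
    FROM sys.columns
    WHERE object_id = OBJECT_ID('dbo.table_pvt');

    PRINT @columns;
    ```

    QUOTENAME also handles column names containing brackets, which the hand-built `'[' + name + ']'` version does not.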

  • This screams of poor design to me. How can you not know what columns are in your table? If you go down this path, I don't think there is any way you can avoid being open to a SQL injection attack.

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • Hi Sean, I am receiving data from many different types of file sources. The data from each source is being transformed into a comma-delimited string of key=value pairings, and a source can have anywhere from some to all of the data that needs to be in the final transformation. I have a schema table that holds all the columns in the final calculated table for each data source, so if a source only has a few of the items I can split the row string into the number of rows I need in my final table by referencing the schema table. Once the split table is built with all of the rows I need, I can dynamically pivot it and apply the business logic. I would like to maintain only the schema table rather than each insert statement when I kick out invalid orders, for example. So this was where I was going with my solution. The reason I posted the question was because, like you said, this may not be the best design... so I'm asking for suggestions for a better design.

  • Marcus Farrugia (4/9/2015)


    Hi Sean, I am receiving data from many different types of file sources. [...] The reason I posted the question was because like you said this may not be the best design...so I'm asking for solutions for better design.

    Ahh, that makes a lot more sense. I am no expert on integration, but does each source have a definitive layout? Where I am going with that is that you could load this into a staging table regardless of the source and then have a specific procedure for each source that you could call dynamically. That would allow lots of flexibility but also give you granular control over each source. I am flying blind here, of course, as I am sure there are lots of other things going on that I don't know about.


  • Right now we're only working with a few internal data sources that we can control. The final solution will have files from multiple outside clients who will likely be FTPing their data, which can be text, Excel, CSV, etc., into a folder, so the data sources coming in will be jagged and changing in terms of client data formats and new and departing clients. So the idea is to transform the files into a comma-delimited string and then match the available columns that the client provided against our 'master' column list and format against that. The master list can change over time depending on business needs, so I would like to reference the columns in that table as a variable string and maintain that, rather than maintain multiple column lists in the insert statements. Hope that makes sense.

  • This sounds like a job for SSIS and I can barely spell it. Let me call in some reinforcements.


  • Just as a suggestion that might not solve anything.

    I'd recommend using QUOTENAME() instead of inserting the brackets yourself.

    CREATE TABLE dbo.Areyoukidding(
        My_ID int,
        "Why]Not?" varchar(15),
        "Don'tMess[With]me" numeric(18,8));

    Declare @columns as nvarchar(max);

    -- QUOTENAME escapes the embedded brackets correctly
    SELECT @columns = STUFF((
        SELECT ', ' + QUOTENAME(name)
        FROM syscolumns
        WHERE id = OBJECT_ID('dbo.Areyoukidding')
        FOR XML PATH('')), 1, 1, '');

    print @columns;

    -- Hand-built brackets break on names that contain ] or [
    SELECT @columns = STUFF((
        SELECT ', [' + name + ']'
        FROM syscolumns
        WHERE id = OBJECT_ID('dbo.Areyoukidding')
        FOR XML PATH('')), 1, 1, '');

    print @columns;
    GO

    DROP TABLE dbo.Areyoukidding;

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Marcus Farrugia (4/9/2015)


    Right now we're only working with a few internal data sources that we can control. [...] The master list can change over time depending on bus needs so would like to reference the columns in that table as a variable string and maintain that rather than maintain multiple column lists in the insert statements.

    I agree that this sounds like a job for SSIS, but the staging table for the data is also a wonderful idea.

    If it were me, I'd have my package set up with a lot of EXECUTE T-SQL tasks.

    IF (SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
        WHERE TABLE_NAME = 'StagingFiles') IS NULL
        CREATE TABLE dbo.StagingFiles (TableName VARCHAR(50), mydata VARCHAR(MAX));
    -- Create your table via an SSMS script because you can just truncate it later

    IF (SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
        WHERE TABLE_NAME = 'TableColumns') IS NULL
        CREATE TABLE dbo.TableColumns (TableName VARCHAR(50), ColumnList VARCHAR(MAX));

    INSERT INTO dbo.TableColumns (TableName)
    SELECT Table_Name
    FROM INFORMATION_SCHEMA.TABLES
    WHERE Table_Name IN ('table1','table2','table3');

    -- There's code that I have to find that puts a string together
    -- using a Tally table. I'll look for it in a bit. But you put the columns
    -- together in one string for use for INSERTs and SELECTs.

    Then use a Data Flow task in SSIS to load the StagingFiles table, with the first column value determined by the name of the file. After that you can access the TableColumns table to generate your dynamic SQL for loading the various tables. Use a stored procedure in an Execute T-SQL task to do this rather than a Data Flow; with Data Flows, you have to identify and know every single possible source and destination.

    If you know all your Sources and Destinations in advance, though, use SSIS and an expression to direct the flow of the package to the proper Data Flow task so you don't have to worry about all this T-SQL.

    EDIT: Sorry for the original incomplete post. Somehow hitting enter accidentally caused my post to post.
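    The "put the columns together in one string" step left as a placeholder above could look something like the following. This is a hypothetical sketch, reusing the STUFF/FOR XML PATH pattern from earlier in the thread and the TableName/ColumnList columns as defined in the post:

    ```sql
    -- Hypothetical sketch: fill ColumnList with a comma-delimited,
    -- QUOTENAME-escaped column list for each table named in dbo.TableColumns.
    UPDATE tc
    SET ColumnList = STUFF((
            SELECT ', ' + QUOTENAME(c.name)
            FROM sys.columns AS c
            WHERE c.object_id = OBJECT_ID('dbo.' + tc.TableName)
            ORDER BY c.column_id
            FOR XML PATH('')), 1, 2, '')   -- strip the leading ', '
    FROM dbo.TableColumns AS tc;
    ```

    The stored procedure that loads each target table can then read ColumnList and splice it into its dynamic INSERT.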

    Brandie Tarvin, MCITP Database Administrator. LiveJournal blog: http://brandietarvin.livejournal.com/. On LinkedIn, Google+, and Twitter. Freelance writer: Shadowrun; Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • FYI: If my above post is unclear, please let me know. I posted while juggling another task.

