Using a field name in calculation

  • Hello, I have a table that looks like this. It has 18 columns of sales data.

    fields:

    fldProduct,fldOffset,[1],[2],[3],[4],[5],[6],[7],......[18]

    data:

    ProductA,-3,100,140,150,175,180,125,300,.......200

The fldOffset field contains a number like +2 or -3.

    I was previously doing this in a VB function using an array, but now I need to move it to a stored procedure for processing on the server.

    I need to take the data in column (x), offset it by fldOffset, and put it back in column (x + fldOffset).

    Using the data above, where the offset is -3, I would take the data in column [4] (175) and move it to column [1], column [5] to [2], [6] to [3], and so on. Everything in this record would move back 3 columns.  The next record may have a different offset, say +2, and then I would move everything forward (to the right) 2 columns.

    Any assistance would be greatly appreciated!

Breaking the relational data model and normalization rules is not smart.

    It does not simplify your life.

    Homogeneous data must be in a single column. Period.

    And the column names [1]...[18] must become numbers in a column next to column [Sales].

    Then you'll never have problems with offsets.
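
    For illustration, a rough sketch of that shape (the names tblSales, fldPeriod and fldSales are placeholders, not from the original table):

    -- Sketch only: one row per product per period; the old column names [1]..[18] become values of fldPeriod
    create table tblSales (
        fldProduct nvarchar(25) not null,
        fldOffset  int          not null,  -- really belongs on a separate product table; kept here to keep the sketch short
        fldPeriod  int          not null,
        fldSales   int          null,
        constraint PK_tblSales primary key (fldProduct, fldPeriod)
    )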

    _____________
    Code for TallyGenerator

  • Hi,

Yes, this can be done.  Nice bit of complex T-SQL!  I don't have time to knock anything together now, it being 7pm on a Friday with the wife expecting me home, but it's certainly possible.

You'd have to use dynamic SQL: build the statement up as a string and then execute it.

  • I've gotta agree with Sergiy on this, it's not a good idea to do this kind of thing.

    However....

In the above example, what's supposed to happen to columns 1, 2 and 3 when the offset is -3, or to columns 16, 17 and 18 when the offset is +3?

    Are we just taking input x and then shifting the values of columns x through 18?

    Or is it just a one off overwrite of column (x + fldoffset) with the value of column x?

    (I'm just trying to think of the boundary rules so as not to work with nonexistent columns)

Yes, it's just replacing the values, like:

    Or is it just a one off overwrite of column (x + fldoffset) with the value of column x?

Select fldProduct,[1],[2],[3],[4],[5]

    from MyTable

    where fldOffset = 0

    UNION ALL

    Select fldProduct,NULL,[1],[2],[3],[4]

    from MyTable

    where fldOffset = 1

    UNION ALL

    Select fldProduct,[2],[3],[4],[5],NULL

    from MyTable

    where fldOffset = -1

    and so on, one branch per possible offset (shown here for just the first five columns; the table name MyTable is assumed)
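
    The same idea can also be written without the UNION ALL, using one CASE expression per output column; again only a sketch over the first five columns, with the table name MyTable assumed:

    select fldProduct,
           case fldOffset when 0 then [1] when 1 then NULL when -1 then [2] end as [1],
           case fldOffset when 0 then [2] when 1 then [1]  when -1 then [3] end as [2],
           case fldOffset when 0 then [3] when 1 then [2]  when -1 then [4] end as [3],
           case fldOffset when 0 then [4] when 1 then [3]  when -1 then [5] end as [4],
           case fldOffset when 0 then [5] when 1 then [4]  when -1 then NULL end as [5]
    from MyTable
    -- every possible offset value still needs its own WHEN branch in each column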

    jg

P.S. You should not store data this way.

     

Right, sorry, I'm on the road so I'm doing this in Notepad. I think you might have to cast or convert the variables to add them to the SET clause of the update, and it's probably riddled with syntax errors, but... how's this?

create proc MyProc @col_num int, @fldProduct nvarchar(25)

    as

    declare @offset int

    declare @SQL1 nvarchar(500)

    -- look up this product's offset

    set @offset = (select fldOffset from MyTable where fldProduct = @fldProduct)

    -- the int values have to be converted before they can be concatenated,
    -- and the product name needs to be quoted inside the dynamic SQL

    set @SQL1 = 'update MyTable

    set [' + cast(@col_num + @offset as nvarchar(10)) + '] = [' + cast(@col_num as nvarchar(10)) + ']

    where fldProduct = N''' + @fldProduct + ''''

    exec (@SQL1)

    Does that work at all?
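
    For what it's worth, a test call against the sample data above might look like this (purely illustrative):

    exec MyProc @col_num = 4, @fldProduct = N'ProductA'
    -- with fldOffset = -3 this builds and runs:
    -- update MyTable set [1] = [4] where fldProduct = N'ProductA'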

  • Oh and forgot to mention...

You should not store data this way. It really is a horrible way to do things. Still, I'm guessing you're dealing with someone else's design sooo.... hypothetically speaking...

Thanks Martin and Jeff.  That helps a lot.

Yes, I did inherit the layout.  There are also 18 periods of history in the same record (36 columns in total).   I wondered, however, if it is better to keep this design or increase the number of records in this table 36 times (1 for each period).  So if the average table has 8,000 records and I 'normalize', I'll have 288,000 records?  I'm not sure what is best. Any suggestions?

You'll find that many folks here have hundreds of millions or even billions of rows of data, so don't fret expanding yours to 288k. The main problem with the design as it stands was amply demonstrated in this thread. I'm betting that you wouldn't have needed assistance to accomplish your task if the data was normalized. Not only does it make it easier to work with, most of the time it is actually more efficient than parsing strings and jumping through other similar hoops, even though it greatly increases the number of rows you are dealing with.
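
    As a rough illustration (made-up names, since no normalized table exists yet): once each period is its own row, the per-record shift turns into one plain set-based statement instead of dynamic SQL.

    -- Sketch only: tblSales is the hypothetical normalized layout (fldProduct, fldOffset, fldPeriod, fldSales)
    update tblSales
    set    fldPeriod = fldPeriod + fldOffset
    where  fldOffset <> 0
    -- rows pushed outside periods 1..18 would still need a business rule (delete, clamp, or keep)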

    For the record, many of us are very understanding of inherited data models, especially the consultants out there. Most consultants have to work with what exists, and while they might make recommendations, don't often have the power to actually enforce them.

Thanks for the info.  You're right on about the last part. You play the hand you're dealt (within the scope, budget and timeframe of the project, of course!)

    Thanks again
