Issue with changing column datatype

  • I have a big table with 100+ GB of data. I need to change one column's data type from varchar(200) to varchar(max). I tried the ALTER command, but it is taking a huge amount of time. Is there any workaround for this?
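
    For illustration, a minimal sketch of the kind of statement being discussed (dbo.BigTable and the Notes column are placeholder names):

        -- Widen the column in place; specify NULL or NOT NULL to match the existing definition
        ALTER TABLE dbo.BigTable
            ALTER COLUMN Notes varchar(max) NULL;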

  • No sir,

    you are pretty much stuck with that approach, unless you want to create another table with the new structure and then do an INSERT INTO ... SELECT, but that will take a lot longer, since it has to duplicate all the data (a rough sketch of that workaround is below).
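
    If you do go that route, a minimal sketch of the copy-and-swap approach might look like this (all object and column names are placeholders, and you would also need to recreate indexes, constraints, and permissions on the new table):

        -- 1. Create a new table with the widened column
        CREATE TABLE dbo.BigTable_new (
            Id    int          NOT NULL,
            Notes varchar(max) NULL
            -- ... remaining columns unchanged
        );

        -- 2. Copy the data across
        INSERT INTO dbo.BigTable_new (Id, Notes)
        SELECT Id, Notes
        FROM dbo.BigTable;

        -- 3. Swap the tables once the copy is verified
        EXEC sp_rename 'dbo.BigTable', 'BigTable_old';
        EXEC sp_rename 'dbo.BigTable_new', 'BigTable';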

    My best advice is to do your analysis very thoroughly, because adding a varchar(max) column to a table that large is not such a good idea.

    You must know that varchar(max) can hold up to 2 GB of data, and since it is a variable-length data type, SQL Server will have to allocate space to handle the data manipulation, which takes a long time.

    varchar is variable-length, non-Unicode character data. max indicates that the maximum storage size is 2^31-1 bytes. The storage size is the actual length of the data entered + 2 bytes. The data entered can be 0 characters in length.
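
    As a quick sanity check before committing to varchar(max), it may be worth looking at how long the existing values actually are (table and column names are placeholders):

        -- How close does the existing data come to the varchar(200) limit?
        SELECT MAX(DATALENGTH(Notes)) AS MaxBytesUsed,
               COUNT(*)               AS TotalRows
        FROM dbo.BigTable;

    If nothing comes anywhere near 8,000 bytes, a wider varchar(n) may be enough and avoids the overhead of a max type.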

    QUIGROUP- Need a Certified experienced DBA for a project or troubleshooting? Need help with programming, database recovery, performance tuning, ETL, SSRS or developing new databases? Contact us.. 1-786-273-9809

  • Check if there are any indexes containing this varchar column.

    If you change the data type of the column, indexes that contain that column also have to change.

    Therefore, save the creation scripts for those indexes, drop them, change the data type of the column, and then recreate the indexes with the WITH (ONLINE = ON) option. Test this scenario on a smaller copy of the table first.
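
    A rough sketch of that sequence (index, table, and column names are placeholders; the real index definitions should come from your saved scripts, and note that a varchar(max) column can only appear as an INCLUDE column, not as an index key):

        -- Find indexes that contain the column
        SELECT i.name AS index_name
        FROM sys.indexes AS i
        JOIN sys.index_columns AS ic
             ON ic.object_id = i.object_id AND ic.index_id = i.index_id
        JOIN sys.columns AS c
             ON c.object_id = ic.object_id AND c.column_id = ic.column_id
        WHERE i.object_id = OBJECT_ID('dbo.BigTable')
          AND c.name = 'Notes';

        -- Drop the affected index, change the column, then rebuild the index online
        DROP INDEX IX_BigTable_Notes ON dbo.BigTable;

        ALTER TABLE dbo.BigTable
            ALTER COLUMN Notes varchar(max) NULL;

        -- ONLINE = ON requires an edition that supports online index operations,
        -- and some older versions do not allow online operations on indexes that contain (max) columns
        CREATE NONCLUSTERED INDEX IX_BigTable_Notes
            ON dbo.BigTable (Id)
            INCLUDE (Notes)
            WITH (ONLINE = ON);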

    A 100 GB table is also a good candidate to consider for partitioning.
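
    If you do look at partitioning, a minimal sketch of a partition function and scheme might look like this (the CreatedDate column, the yearly boundary values, and the single filegroup are assumptions for illustration only):

        -- Yearly partitions on an assumed CreatedDate column
        CREATE PARTITION FUNCTION pf_ByYear (datetime2)
            AS RANGE RIGHT FOR VALUES ('2012-01-01', '2013-01-01', '2014-01-01');

        CREATE PARTITION SCHEME ps_ByYear
            AS PARTITION pf_ByYear ALL TO ([PRIMARY]);

        -- The table (via its clustered index) would then be rebuilt on the scheme, e.g.:
        -- CREATE CLUSTERED INDEX CIX_BigTable ON dbo.BigTable (CreatedDate) ON ps_ByYear (CreatedDate);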

    _____________________________________________________
    Microsoft Certified Master: SQL Server 2008
    XDetails Addin - for SQL Developers
    blog.sqlxdetails.com - Transaction log myths

