shrink huge datafile

  • After a huge data purge I have to shrink one datafile from 2 TB down to 100 GB, but this file contains a table

    that is constantly being accessed (mostly inserts of new records).

    What is the best approach to perform this shrink without causing locks on that table? I'm thinking of a

    step-by-step shrink (e.g. from 2 TB to 1.8 TB, and so on).

    What do you think?
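
    Something along these lines is what I have in mind. A rough sketch only: the logical file name
    ('MyDataFile') and the step size are placeholders; DBCC SHRINKFILE does accept an int variable
    for the target size, which is what makes the loop possible.

    -- Shrink the file in ~100 GB steps instead of one giant operation,
    -- so each DBCC SHRINKFILE call moves less data and finishes sooner.
    DECLARE @CurrentMB int = 2097152;   -- current size: 2 TB expressed in MB
    DECLARE @FinalMB   int = 102400;    -- final target: 100 GB
    DECLARE @StepMB    int = 102400;    -- shrink roughly 100 GB per pass

    WHILE @CurrentMB > @FinalMB
    BEGIN
        SET @CurrentMB = CASE WHEN @CurrentMB - @StepMB < @FinalMB
                              THEN @FinalMB
                              ELSE @CurrentMB - @StepMB
                         END;
        DBCC SHRINKFILE (N'MyDataFile', @CurrentMB);
    END;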

  • For the table(s) you purged all the data from, wouldn't it help to drop and recreate the clustered index(es)?

    That way you are compacting the actual tables you purged, and the other tables being accessed won't be affected, as opposed to doing a database shrink, which would require you to reorganize or rebuild your indexes afterwards anyway.

    I've never tried it, but I'd be interested in what others think.

    Hmmm... although that still won't resize your data file. I'll be interested to hear.
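
    Roughly what I mean, with made-up object names (and note that ONLINE = ON requires Enterprise Edition):

    -- Rebuilding the clustered index rewrites the table's pages compactly,
    -- reclaiming the space the purge left behind inside the data file.
    ALTER INDEX PK_PurgedTable
        ON dbo.PurgedTable
        REBUILD WITH (ONLINE = ON);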

  • There is only that table in this datafile.

    By the way, I've done the job: I resized step by step without causing locks.

    I did all the maintenance tasks after the shrink (rebuilt the nonclustered indexes, reorganized the PK since I can't rebuild it offline, and updated the table stats), and it seems to have worked well.
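
    The post-shrink maintenance looked roughly like this (the object names here are placeholders, not the real ones):

    -- A shrink shuffles pages around and fragments indexes, so tidy up afterwards.
    ALTER INDEX IX_BigTable_InsertDate ON dbo.BigTable REBUILD;     -- nonclustered index: full rebuild
    ALTER INDEX PK_BigTable ON dbo.BigTable REORGANIZE;             -- clustered PK: reorganize (always online)
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;                   -- refresh stats after all the page movement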

  • Was this purge on some form of audit table?

    --Jeff Moden



