Need to move 65GB of data to an archive database

  • I have an issue that I am struggling to figure out the best way to handle. I have a managed server running SQL Server 2005 Workgroup Edition, with two 250GB partitions. I attempted to copy data from one database to a new database that would become the "archive" database via an INSERT ... SELECT. This was taking forever, and I eventually killed the query with the KILL command because the available space on the C: drive dropped as low as 20GB. On the D: drive I have the 85GB full backup that triggered my foray into database administration, with about 12GB of free space remaining on that drive.

    Regardless, one of the four tables I am trying to move to the new archive database is 65GB (according to SSMS), and the other three amount to roughly 12GB. I hope to have about 108GB of free space to work with (I need to undo the effects of my first attempt), but I don't know the best way to go about moving the data.

    Eventually, once the old data is moved off of the original database, I plan to delete it from the source. Once the old data is wiped, I will take a full backup of the original database, which will hopefully be much smaller than 85GB. I also plan to take a full backup of the archive database and store it somewhere safe and off the server. All in all, I am just trying to rearrange the data for now. I know I have other issues, but those will have to wait until I have a reliable backup established.

  • I would say you want to take the "how do you eat an elephant" approach: one bite at a time.

    1. Make sure the destination database is set to the SIMPLE recovery model (a snippet for this follows the script below). You can back it up or change the recovery model to FULL later, after the data is transferred.

    2. Move the data in smaller pieces instead of all at once; the all-or-nothing approach bloats the log with undo information in case of a rollback.

    Instead, move 10K rows or so at a time, as long as you can identify which rows have not been copied yet. That way you can interrupt it if you need to and restart it later without any issues.

    For example, assume there is an IDENTITY() column or some other PK that uniquely identifies each row. You could use something like this:

    -- Limit each pass to 10,000 rows so no single transaction gets huge.
    SET ROWCOUNT 10000

    DECLARE @NewID int

    WHILE 1 = 1
    BEGIN
        -- Find the highest ID already archived; start from 0 on the first pass.
        SELECT @NewID = MAX(ID) FROM Archived.dbo.MyTable
        IF @NewID IS NULL SET @NewID = 0

        -- Copy the next batch in ID order so the MAX(ID) checkpoint holds.
        INSERT INTO Archived.dbo.MyTable (ID, ColumnList)
        SELECT ID, ColumnList
        FROM Production.dbo.MyTable
        WHERE ID > @NewID
        ORDER BY ID

        -- Nothing left to copy: done.
        IF @@ROWCOUNT = 0
            BREAK
    END
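
    For step 1 above, the recovery model change is a one-liner; a minimal sketch, assuming the archive database is named Archived as in the script:

    -- SIMPLE recovery keeps the log from growing while the bulk copy runs.
    -- Switch back to FULL (and take a full backup) once the transfer is done.
    ALTER DATABASE Archived SET RECOVERY SIMPLE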

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • That makes sense. I will attempt that next, after I figure out how to get my space back.

    I know I probably should have posted this in the newbie forum. Panic got the best of me this time.

  • Ugh! The best I could do was get back 23GB on my C: drive, bringing the free space up to 43GB. Before I tried to transfer the data in one giant move I had 108GB available, which would have been ample space to copy the data over to the new archive database, verify that it matched the table in the source database, and then delete the data from the source database. Now I am convinced I will have to break this down into chunks so that I don't run into space constraints, but I should have done that from the start.

    Regardless, I looked at the transaction log for the source database. It is weighing in at 19,972MB, and 99% of it is available. I am convinced this is because I tried to do such a large transfer, and that the attempt blew up the initial size of the file as well. However, I can't shrink the file; I imagine this is because the initial size is now 19,973MB. I have tried to change that value, but the changes don't seem to take effect. I have read that it cannot be changed and that the only way to make it smaller is to recreate the database, which is something I don't want to do just yet (although I have a feeling that will eventually be needed to start with a clean slate). I also deleted the archive database that I created yesterday, just in case something in it was larger than it should be.
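
    In case it helps anyone else hunting for space: the log size and percent used can be checked with DBCC SQLPERF, and sp_helpfile lists the logical file names a shrink would need. A minimal sketch (Production is a stand-in for the source database, matching Lowell's script):

    -- Report log size and percent used for every database.
    DBCC SQLPERF(LOGSPACE)

    -- List the current database's files and their logical names.
    USE Production
    EXEC sp_helpfile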

    That said, I am not really sure of the best route forward. I am at the point where I am going to move the data slowly so that I don't lose anything. Any suggestions would be greatly appreciated. I'm not sure where else to look for the missing 40GB.

  • On a whim I decided to back up the transaction log. That did the trick. Now the initial size of the transaction log file is 13MB and my free space is up to 63GB. I'm still trying to figure out how to find the missing 40GB, though. I could probably do without it, but at this point the more free space I can muster, the more comfortable I'll feel.
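
    For anyone retracing this, the fix was essentially a log backup followed by a shrink; a sketch, assuming the source database is named Production (matching the script above) and that its log's logical name is Production_Log (a placeholder; get the real one from sp_helpfile):

    -- Back up the log so the inactive portion can be released.
    BACKUP LOG Production TO DISK = 'D:\Backups\Production_Log.trn'

    -- Then shrink the log file; with no target size specified,
    -- it returns the file to its original create size.
    USE Production
    DBCC SHRINKFILE (Production_Log)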

  • WOO! Found the last 50GB or so.

    So I discovered that my tempdb was at 60GB or so. I think I tried to shrink its log file yesterday, but that had no effect on the size. Today I discovered DBCC FREESYSTEMCACHE('ALL') from an MS support doc. I ran that command and then shrank tempdb, and I was able to recover 60GB. I am now at 120GB available on my C: drive.
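
    A sketch of that sequence, for anyone in the same spot; tempdev is tempdb's default logical data file name, and the 1024MB target is just an illustrative number:

    -- Clear cached entries that can keep tempdb space pinned.
    DBCC FREESYSTEMCACHE('ALL')

    -- Then shrink tempdb's data file down to the target size (in MB).
    USE tempdb
    DBCC SHRINKFILE (tempdev, 1024)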

    Anyway, two very important lessons here (among others).

    1. Always ask questions, and don't be content with answers that you know might not be good. In my case, we were relying on backups from our server provider, but those were of the entire server, not the database. I was fine with this until the database fell into my lap. You can see why I am disappointed in myself for not asking why we were satisfied with that.

    2. Try to cut large transactions into chunks. I am sure there are ways to get around filling up the transaction log. I am now curious whether the bulk-logged recovery model could have helped.
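
    For the record, the switch itself is a one-liner, though as I understand it an ordinary INSERT ... SELECT stays fully logged even under bulk-logged; minimal logging applies to bulk operations such as SELECT INTO, BULK INSERT, and bcp, so chunking would still matter. A sketch, reusing the Archived name from earlier:

    -- Cuts logging for bulk operations only; remember to switch back
    -- to FULL and take a full backup afterwards.
    ALTER DATABASE Archived SET RECOVERY BULK_LOGGED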

    Regardless, I know I have a long way to go in securing this database, and I know I did things in a less than ideal fashion. However, I needed to start somewhere.

  • Could you use BCP? That might save you time and space.

  • Probably.

    The script that Lowell provided was great, but it is taking a long time to complete. Actually, it is still running. Regardless, I followed his approach because it was similar to the one I was going to use (which failed horribly). BCP did cross my mind at one point yesterday, but I decided to just let this run its course. That said, I doubt I will have to transfer this much data again for the foreseeable future. If I do, I will try out BCP.
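
    If it does come up again, a bcp round trip might look like the following; the server, path, and table names are all placeholders:

    rem Export the table in native format (-n) over a trusted connection (-T).
    bcp Production.dbo.MyTable out D:\Transfer\MyTable.dat -n -S MYSERVER -T

    rem Load into the archive in 10,000-row batches (-b) so each batch
    rem commits on its own and keeps the log small.
    bcp Archived.dbo.MyTable in D:\Transfer\MyTable.dat -n -S MYSERVER -T -b 10000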
