Restoring a subset of data to a test server

  • I'm running into a space issue on our test server. Our main SQL server has plenty of hard drive space, but our test server only has about 250 GB. Our database is 240+ GB, so I quickly run out of space whenever I try to do anything on the test server. Is it possible for me to restore, say, 90% of the tables in their entirety to the test server but only a subset of data from the largest tables? Some of our largest tables contain sales data dating back to 2005, and I really only need the current year. (I'm working on a partition plan, but I'd like to test it on my test server and I don't have the space. Catch-22!)

    I'm using the Full recovery model. I can't pick and choose which tables to restore from a backup, correct? I would actually have to pull data out of my live SQL server. Any advice on how to accomplish this?

    Thanks!

  • In this situation, my first thought is to use the Import wizard to import the tables you need in full from the main database, then import the large ones via a query to control how much of each you get.

    It will be interesting to see other people's suggestions here.

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • Using SQL Server's backup process, you can do what's called a "partial restore", which restores the primary filegroup plus whichever secondary filegroups you choose. How useful that is in your case depends on how the tables are distributed across separate filegroups. Some third-party backup tools provide more options for partial restores, including per-table restores.
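
    If the big sales tables were already in their own filegroup, the piecemeal syntax would look roughly like this (the database, filegroup, logical file, and path names below are just placeholders for whatever yours are called):

        RESTORE DATABASE SalesTest
            FILEGROUP = 'PRIMARY',
            FILEGROUP = 'CurrentSales'
        FROM DISK = N'D:\Backups\Sales_full.bak'
        WITH PARTIAL,
             MOVE N'Sales_data' TO N'E:\TestData\SalesTest.mdf',
             MOVE N'Sales_CurrentSales' TO N'E:\TestData\SalesTest_CurrentSales.ndf',
             MOVE N'Sales_log' TO N'E:\TestData\SalesTest_log.ldf',
             RECOVERY;   -- under the Full recovery model you'd also restore log backups before recovering

    Any filegroups you leave out of the restore simply stay offline, so the tables stored in them won't be available on the test copy.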

    You can use the Import and Export Wizard to copy table schemas and/or data between servers. It requires some up-front mouse clicking and SQL coding to select all the appropriate tables and options, but once that's done you can save the task as an SSIS package, which can be re-run as many times as needed. Choosing the option "Write a query to specify the data to transfer" lets you supply a SQL SELECT statement for each table, which is where you would restrict the data by date range.
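
    For each of the big sales tables, that query could be as simple as something like this (the table and column names are placeholders for your actual schema):

        SELECT *
        FROM dbo.SalesOrderHeader   -- placeholder name for one of the large sales tables
        WHERE OrderDate >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()), 0);   -- first day of the current year

    Once it's saved as an SSIS package, you can tweak that date filter and re-run it whenever you need a fresh copy.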

    Also, you mentioned that the recovery model on your test server is Full. On our test servers, we typically leave it as Simple, because bulk loads are faster and the transaction log needs far less space. If disk space is critically low on the test server and you're only working with a copy of production data, then Full recovery is working against you.
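
    Switching it is quick once the copy is on the test server (the database and log file names here are placeholders):

        ALTER DATABASE SalesTest SET RECOVERY SIMPLE;

        -- optional, test copy only: shrink the log back down after switching
        DBCC SHRINKFILE (N'SalesTest_log', 512);   -- target size in MB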

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • >>I can't pick and choose what tables to restore from a backup, correct?

    I believe third-party tools can do this. For example, with HyperBac you can run your test server directly off the backup. You can also issue SELECTs directly against the backup file, and if my memory serves, you can selectively restore tables.

    It will probably be more cost-effective to purchase disk space than to buy a third-party tool. Hopefully someone will have a 'native' solution for you.
