What is the best way to send the production data to a separate server on a daily basis?

  • Hi,

    I have a requirement from the developers' team: they want the production data on a separate server so that they can use it for reporting purposes. We have 4 databases in production, and they want all 4 databases refreshed to the reporting server every night. One of the 4 databases is 90 GB.

    What is the best way to send the production data to a separate server on a daily basis?

  • 1) transactional replication

    2) backup/restore

    3) "handmade" tool

    Depending on the load on the production server, you can use any of the high-availability options:

    replication

    logshipping

    mirroring

    But you may have to factor in the pros and cons of each method you adopt. I have set up log shipping for a requirement similar to yours, with the STANDBY / READ ONLY option on the secondary server. Since the secondary will only be used for reporting, that serves the purpose. But factor in your daily DB maintenance tasks as well.
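
    As a rough sketch, the nightly log restore on the secondary in standby mode might look like the following (the database name, share, and undo-file path are placeholders, not details from this thread):

    ```sql
    -- Hypothetical example: apply a copied log backup on the reporting server
    -- in STANDBY mode, which leaves the database readable between restores.
    RESTORE LOG ReportDB
    FROM DISK = N'\\Secondary_Server\LogShip\ReportDB_tlog.trn'
    WITH STANDBY = N'D:\LogShip\ReportDB_undo.dat';
    ```

    Note that users must be disconnected from the standby database while each restore is applied, so schedule the restores around the reporting window.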

  • Hi,

    Currently we are following the below procedure:

    1. A backup job that backs up the 2 databases with a regular backup script:

    BACKUP DATABASE <database name>
    TO DISK = 'Z:\Backup\<database name>.bak'

    2. A scheduled task (created from Control Panel) to copy the .bak files to the secondary server.

    3. A restore job to restore the databases on the secondary.

    This solution is working fine and gives the developers databases to query for reporting.

    But the Problem is:

    I'm running a daily backup job (separate from the backup job in step 1 above), which creates backup files named DBNAME_mmddyyyy.BAK,

    and in the scheduled task I'm calling a copy.cmd file.

    copy.cmd contains the following:

    DEL \\Secondary_Server\DBRefresh\Mydb.BAK

    copy Z:\Backup\Mydb.BAK \\Secondary_Server\DBRefresh\Mydb.BAK

    So I need to run the full backup job twice daily, which I want to avoid.

    I haven't figured out how to automatically copy the dated files (DBNAME_mmddyyyy.BAK) from the scheduled task instead of dbname.BAK.

    Please advise.

  • rambilla4 (10/1/2009)


    Hi,

    I have a requirement from the developers' team: they want the production data on a separate server so that they can use it for reporting purposes. We have 4 databases in production, and they want all 4 databases refreshed to the reporting server every night. One of the 4 databases is 90 GB.

    What is the best way to send the production data to a separate server on a daily basis?

    I think snapshot replication is what you need. Schedule it to occur every night after the day's work...

    Regards,
    [font="Verdana"]Sqlfrenzy[/font]

    Then rename that daily backup file from DBNAME_mmddyyyy.BAK to Mydb.BAK prior to the copy step, using the ren command.
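
    As a sketch, that rename could even be an extra step in the backup job itself, so the scheduled copy keeps working against a fixed name. The path, database name, and mmddyyyy format below are assumptions taken from the earlier post, and the step assumes xp_cmdshell is enabled:

    ```sql
    -- Hypothetical job step: rename today's dated backup to a fixed name
    -- before the scheduled copy runs.
    DECLARE @stamp CHAR(8), @cmd VARCHAR(500);
    SET @stamp = REPLACE(CONVERT(CHAR(10), GETDATE(), 101), '/', '');  -- style 101 = mm/dd/yyyy -> mmddyyyy
    SET @cmd = 'ren Z:\Backup\Mydb_' + @stamp + '.BAK Mydb.BAK';
    EXEC master.dbo.xp_cmdshell @cmd;
    ```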

    MJ

    I think snapshot replication is what you need. Schedule it to occur every night after the day's work...

    Using snapshot replication to move the large 90 GB database would cause high network throughput on those servers.

    Instead, using transactional replication for the large tables and snapshot replication for the small-to-medium tables would be fine; I did that in a similar scenario.

    Also, if moving backups and restoring is the solution you have been implementing, why not improve it by taking a weekly full backup and moving it to the secondary server, then taking daily differentials and moving those across, and restoring from the complete set (full backup + differential backups)?

    This would remove the need to perform a daily full backup on the prod box, and you need not move the full backup (90 GB) every night.
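
    A minimal sketch of that cycle, with placeholder database names and paths (not from the thread); note that because differentials are cumulative since the last full backup, only the most recent differential actually needs to be restored:

    ```sql
    -- Weekly (e.g. Sunday), on production:
    BACKUP DATABASE BigDB TO DISK = N'Z:\Backup\BigDB_full.bak' WITH INIT;

    -- Every night, on production (cumulative since the last full):
    BACKUP DATABASE BigDB TO DISK = N'Z:\Backup\BigDB_diff.bak' WITH DIFFERENTIAL, INIT;

    -- On the reporting server, after copying both files across:
    RESTORE DATABASE BigDB FROM DISK = N'D:\DBRefresh\BigDB_full.bak' WITH NORECOVERY, REPLACE;
    RESTORE DATABASE BigDB FROM DISK = N'D:\DBRefresh\BigDB_diff.bak' WITH RECOVERY;
    ```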


    Bru Medishetty

    Blog -- LearnSQLWithBru

    Join on Facebook Page Facebook.com/LearnSQLWithBru

    Twitter -- BruMedishetty

  • bruhaspathy (10/17/2009)


    I think snapshot replication is what you need. Schedule it to occur every night after the day's work...

    Using snapshot replication to move the large 90 GB database would cause high network throughput on those servers.

    Instead, using transactional replication for the large tables and snapshot replication for the small-to-medium tables would be fine; I did that in a similar scenario.

    I suggested snapshot replication because they only need the day's data to work with. Also, transactional replication demands monitoring and is more difficult to manage than snapshot; and if the systems are on a LAN, I don't think it will cause much of an issue, considering that the complete process will run during off-peak hours.

    I would say, though, that the mixed approach you suggested is good if the person is willing to monitor it.

    Regards,
    [font="Verdana"]Sqlfrenzy[/font]

    We do what you want all the time:

    1. Create a backup device for each database (use SSMS: Server Objects > Backup Devices > right-click to create a new device).

    2. Create a SQL Server Agent job to back up the database to the backup device.

    3. Add another job step that runs an XCopy command to copy the backup device file from the prod server to the dev server:

    XCopy /Y \\YourProdServer\D$\Backups\BackupDeviceName.bak \\YourDevServer\D$\Backups

    4. Create a job on the dev server with a job step that runs a restore script, and schedule it to run.
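
    Steps 1 and 2 can also be scripted; here is a sketch with placeholder device and database names (the names are examples, not from the post):

    ```sql
    -- Step 1 (scripted): register a disk backup device.
    EXEC master.dbo.sp_addumpdevice
         @devtype = 'disk',
         @logicalname = 'MyDB_Device',
         @physicalname = N'D:\Backups\BackupDeviceName.bak';

    -- Step 2: the Agent job's backup step writes to that device.
    BACKUP DATABASE MyDB TO MyDB_Device WITH INIT;
    ```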

  • Our production database is 266400.69 MB and a simple backup, copy to test server, restore on test works fine for us as a nightly job.

  • Our production database is 266400.69 MB and a simple backup, copy to test server, restore on test works fine for us as a nightly job

    Can you tell me how you are copying the .bak file to the secondary server? How did you automate that process?

    I'm assuming your .bak files have the format dbname_mm-dd-yyyy.bak.

    I have a regular backup job (it backs up all databases, including the ones required on the secondary server) which names the .bak files with a date, as dbname_mm-dd-yyyy.bak. With that naming, I'm not able to automate the copy from the scheduled task. So I created one more backup job that uses no date in the name, simply dbname.bak, and automated that; it's working fine. BUT I have to perform the backup twice, which I want to avoid by automating the copy job to send .bak files with the date format dbname_mm-dd-yyyy.bak.

    Please advise.

    thanks

    I assume you want the backup with the date appended so you can keep a couple of days' worth of backups. Why don't you back up to dbname.bak and then add a job step to rename that file to dbname_mm-dd-yyyy.bak?

    I lied: we used to copy to the test server. We have a nice fast SAN now!

    This is the code from our restore-to-test-server job. It executes nightly on the test server.

    DECLARE @CmdString VARCHAR(2000)

    CREATE TABLE #tmpFiles (FileName VARCHAR(150))
    CREATE TABLE #tmpFiles2 (FileName VARCHAR(150), FN VARCHAR(100))

    -- You need to turn on xp_cmdshell usage on the test server.
    -- Substitute wherever you stash your *.bak files.
    INSERT INTO #tmpFiles
    EXEC master.dbo.xp_cmdshell 'dir "\\PCORPRPSQL\g$\MSSQL\Backup\User Databases\*.bak"'

    -- Drop the dir header/footer lines and any files for other databases.
    DELETE FROM #tmpFiles
    WHERE FileName NOT LIKE '%your_dbFile_Name%'
       OR FileName IS NULL

    -- Extract the bare file name: the text after the last space in each dir output line.
    INSERT #tmpFiles2
    SELECT FileName,
           REVERSE(SUBSTRING(REVERSE(FileName), 0, CHARINDEX(' ', REVERSE(FileName))))
    FROM #tmpFiles

    -- MAX(FN) picks the top record from #tmpFiles2. To process more than one file,
    -- delete MAX(FN) from the table and run the select again, thereby getting the next file.
    SELECT @CmdString = 'RESTORE DATABASE [<your_TesTdbFile_Name>] FROM DISK = N''\\PCORPRPSQL\g$\MSSQL\Backup\User Databases\PestPac\' + MAX(FN) + ''' WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10'
    FROM #tmpFiles2

    PRINT @CmdString
    --EXEC (@CmdString)
    -- Delete MAX(FN) here, then run the SELECT/EXEC pair again for the next file.

    DROP TABLE #tmpFiles
    DROP TABLE #tmpFiles2

  • Wallace Wood (10/21/2009)


    Our production database is 266400.69 MB and a simple backup, copy to test server, restore on test works fine for us as a nightly job.

    Just to point out that the backup/restore method carries the additional benefit of giving you a 'free' way of testing the integrity of your database backups, which is something that you should be doing on a regular basis.

    Chris

    Good point! I never thought of it that way. We just do it like this because we're lazy. 🙂
