Log shipping (zip, unzip, restore)

  • Hi

    We are planning to set up log shipping between two SQL Server 2005 Enterprise Edition servers.

    Our transaction log backup sizes are not consistent throughout the day. The following is the scenario:

    T1 at 8:00 -- 315 MB

    T2 at 8:30 -- 152 MB

    T3 at 9:00 -- 2.5 GB

    T4 at 10:00 -- 500 MB

    The bandwidth available is 25 MB/minute, and we take transaction log backups at a 30-minute interval, so the 2.5 GB T3 backup alone would take roughly 100 minutes to copy. For the T3 log transfer we therefore want to zip the file on the primary server, copy it over the WAN, and unzip it on the standby server.

    Now I wanted to know: to perform this kind of action (zip, copy, unzip), can I create a separate job on the primary that puts the unzipped file on the standby and lets the log shipping job restore that log file (T3)?

    I also wish to carry out my T3, T4, etc. backup, copy, and restore steps through my log shipping activity.

    I cannot take 15-minute backups on the primary server to reduce the transaction log backup size; that is another constraint I have.

    Also, please tell me how to zip and unzip through the command line if you have any links.

    Please advise.

    Thanks for the help in advance.

    Regards

    Arvind L

  • Dear Arvind,

    Log shipping does not allow you to zip or unzip the logs, as it is a completely automated process. Please have a look at MSDN for more information on log shipping. As for your other question, you can zip the files by using xp_cmdshell and makecab (a DOS command). Syntax:

    makecab /L <folder where the cab file should be placed> <source file> <cabfilename.cab>

    Example:

    EXEC xp_cmdshell 'makecab /L c:\mybackupcabs c:\mydb mydb_06022008', no_output
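
    One thing to note: xp_cmdshell is disabled by default in SQL Server 2005, so if it has not already been enabled, a sysadmin will need to switch it on first, for example like this:

    -- Enable xp_cmdshell (it is off by default in SQL Server 2005); requires sysadmin.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;
    RECONFIGURE;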

    Regards

    Chandra Mohan

    [font="Verdana"]Thanks
    Chandra Mohan[/font]

  • Chandra Mohan,

    Thank you very much for the reply.

    Yes, log shipping is an automated process, and I know zipping is not part of the log shipping activity. But I was wondering whether we can create an exclusive job to zip, transfer, and unzip only the T-logs that are bigger in size, and resume the normal log shipping jobs for the smaller T-logs.

    Thank you very much for the command you gave for zipping the file. What is the command for unzipping it?

    One more thing: do you have a customised log shipping script for SQL Server 2005?

    Regards

    Arvind L

  • Dear Arvind,

    You can use the extract.exe utility, which comes with Windows XP or the Windows Support Tools, or you can use any command-line zipping utility (e.g. pkzip). I don't have a customised script for log shipping, but I do have a solution to your problem. Please make sure you test it on a test server first, and don't blame me if anything goes wrong; study it carefully before implementing. If you configure log shipping through Management Studio, four jobs are generated:

    1. Alert job

    2. Backup job

    3. Copy job

    4. Restore job

    After configuring, just disable the third (copy) job and write your own script.

    Step 1. Compress the backup file. If you use makecab, the /L parameter makes it write the cab file straight to the location you specify (two birds with one stone).

    Step 2. If you have the extraction utility, call it through xp_cmdshell and extract the file to the required location.

    But be careful and measure the time each step takes; otherwise you will end up with a pile of log files waiting to be restored, and you will be in trouble.
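
    To give a rough idea, the two steps could look something like the sketch below. It is untested, and the paths and file name are only placeholders (it assumes C:\LogShip holds the log backups on the primary, \\STANDBY\LogShip is the copy destination share, and that share maps to D:\LogShip on the standby), so adjust everything for your environment:

    -- Step 1, on the primary: compress the log backup; /L drops the cab
    -- straight onto the standby share, so compression and copy happen together.
    EXEC master.dbo.xp_cmdshell 'makecab /L \\STANDBY\LogShip C:\LogShip\mydb_tlog_200802071230.trn mydb_tlog_200802071230.cab', no_output;

    -- Step 2, on the standby: pull everything out of the cab (/E) into the folder
    -- the restore job reads from (/L), overwriting without prompting (/Y).
    EXEC master.dbo.xp_cmdshell 'extract /Y /E /L D:\LogShip D:\LogShip\mydb_tlog_200802071230.cab', no_output;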

    Regards

    Mohan

    [font="Verdana"]Thanks
    Chandra Mohan[/font]

  • Dear Arvind,

    For extracting the cab file, please follow this link (Extracting the Contents of a CAB File):

    http://support.microsoft.com/kb/176810

    Regards

    Chandra Mohan

    [font="Verdana"]Thanks
    Chandra Mohan[/font]

  • Thanks for the wonderful idea, Chandra Mohan.

    Yes, it is a great idea to disable the copy job and create a custom job for copying the zipped file. But tell me: if we schedule a custom job that is not part of the log shipping activity, how are we going to populate the predefined log shipping system tables? I mean, we still have to populate those tables to track the backup time, copy job time, and restore job time, right? Will it not create a problem?

    I have not tried this before; please give me some more information on which system tables have to be updated when we use an exclusive custom job, like the copy job in our case.

    Also tell me: can we still run the log shipping backup activity against the monitor's threshold values without errors, with the custom job in between, and the restore activity against the other monitor threshold values without errors?

    Thanks again for your help.

    Regards

    Arvind L

  • Dear Arvind,

    Instead of that, you can try this approach. You will have to modify two jobs.

    1. Backup job

    Once the backup job completes, execute the following steps:

    makecab mydb_200802071230 mydb_200802071230_cab (this compresses the transaction log into a cab file; don't put a .cab extension on it)

    ren mydb_200802071230 mydb_200802071230_orig (rename the uncompressed log file)

    ren mydb_200802071230_cab mydb_200802071230 (rename the compressed file back to the original name)

    At this point, whenever the copy job runs it will copy the compressed cab file, thinking it is the uncompressed log file.

    2. Copy job

    Once the copy completes, extract the log file from the compressed cab file back to its original name. The restore job will then start and restore the uncompressed log file.

    These steps take a lot of the burden off us; otherwise, as you said, we would have to look into the system tables and write to them ourselves.
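
    As a rough, untested sketch of those extra steps (the file name and the C:\LogShip and D:\LogShip folders are only placeholders; substitute your own), the commands could be run from the jobs through xp_cmdshell like this:

    -- Appended to the backup job on the primary:
    -- compress the log backup into a cab (no .cab extension)
    EXEC master.dbo.xp_cmdshell 'makecab /L C:\LogShip C:\LogShip\mydb_200802071230 mydb_200802071230_cab', no_output;
    -- park the uncompressed file under a different name
    EXEC master.dbo.xp_cmdshell 'ren C:\LogShip\mydb_200802071230 mydb_200802071230_orig', no_output;
    -- give the cab the original name so the copy job picks it up
    EXEC master.dbo.xp_cmdshell 'ren C:\LogShip\mydb_200802071230_cab mydb_200802071230', no_output;

    -- Appended to the copy job, after the copy to the standby has finished:
    -- the copied file is really a cab, so rename it back
    EXEC master.dbo.xp_cmdshell 'ren D:\LogShip\mydb_200802071230 mydb_200802071230_cab', no_output;
    -- extract it (/E) into the restore folder (/L), recreating the original log file
    EXEC master.dbo.xp_cmdshell 'extract /Y /E /L D:\LogShip D:\LogShip\mydb_200802071230_cab', no_output;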

    Regards

    Chandra Mohan.

    [font="Verdana"]Thanks
    Chandra Mohan[/font]

  • Hi Chandra Mohan,

    Wow, that is a great idea! We can zip the file and rename the original. I think this should work; I will give it a try.

    Please tell me how to pick up the latest backup file in a folder on the primary and compress it.

    I know we have to schedule a job with xp_cmdshell, but what should the logic be?

    I am planning to set up the log backups at a 30-minute interval. I think the transaction log file name in the log shipping process will be like this:

    'dbname_tlog_yyyymmddhhmm.trn'

    How do I incorporate this naming into the makecab command?

    Please help.

    regards

    Arvind

  • Hi,

    Can you do it for me today, Mohan, if you have time?

    Regards

    Arvind
