SQL Server 2005 SSIS: Replicating a Database with Customized Tables & Columns

  • Step 1. I'm not talking about using the transfer objects at all; Step 1 is a backup and restore, which can be done in an SSIS package.

    Step 4. When you copy the data back to DB1, the next time you restore you will get everything you have already pushed. Is anybody else pushing data to this table? If so, you might have some ID collisions.

    Step 5. Schedule the package to run every 30 minutes rather than having it running all the time (see the Agent schedule sketch at the end of this post).

    Step 6 is probably not needed; all the data but one table is coming from the source, and at worst you risk losing UP TO 30 minutes of data. This seems like a wasted step.

    Step 7. Again, not a transfer - a backup and restore.

    CEWII
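
    A minimal T-SQL sketch of that 30-minute schedule, assuming the package has already been wrapped in a SQL Agent job; the job and schedule names here are hypothetical:

        EXEC msdb.dbo.sp_add_jobschedule
            @job_name             = N'DB1_to_DB2_Refresh',  -- hypothetical Agent job that runs the SSIS package
            @name                 = N'Every30Minutes',      -- hypothetical schedule name
            @enabled              = 1,
            @freq_type            = 4,   -- daily
            @freq_interval        = 1,   -- every day
            @freq_subday_type     = 4,   -- repeat on a minutes interval
            @freq_subday_interval = 30,  -- every 30 minutes
            @active_start_time    = 0;   -- starting from midnight

    In SSIS 2005 the package itself would typically run as a job step of type "SQL Server Integration Services Package", so the schedule above drives the whole refresh.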

  • Phil Parkin (5/4/2010)


    Do your backups and restores using Execute SQL tasks.

    After backing up on server 1 you'll need to transfer the file to server 2 before it is restored. I would probably use a script task to do the file transfer.

    You can restore over the top of another database - no need to delete it first.

    I would probably do the backup to a UNC path that both servers have access to; I wouldn't transfer the file at all. There are some potential limitations to this depending on where these servers are located and the size of the pipe between them.

    If you don't need to actually transfer the file, I wouldn't; a sketch of the backup-to-UNC-and-restore approach is at the end of this post.

    CEWII
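
    A minimal T-SQL sketch of the backup-to-UNC-and-restore approach described above, run from two Execute SQL tasks; the share, file name, database names and file paths are assumptions for illustration, and both SQL Server service accounts need rights to the share:

        -- On server 1: back up the source database to a share both servers can reach
        BACKUP DATABASE SourceDataBase
            TO DISK = N'\\fileservername\uncshare\SourceDatabase_Copy.bak'
            WITH INIT, STATS = 10;

        -- On server 2: restore over the top of the existing copy - no need to drop it first
        RESTORE DATABASE DestinationDataBase
            FROM DISK = N'\\fileservername\uncshare\SourceDatabase_Copy.bak'
            WITH REPLACE,
                 MOVE N'SourceDataBase'     TO N'D:\Data\DestinationDataBase.mdf',      -- logical data file name assumed
                 MOVE N'SourceDataBase_log' TO N'D:\Data\DestinationDataBase_log.ldf',  -- logical log file name assumed
                 STATS = 10;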

  • Elliott W (5/4/2010)


    Step 1. I'm not talking about using the transfer objects at all; Step 1 is a backup and restore, which can be done in an SSIS package.

    Step 4. When you copy the data back to DB1, the next time you restore you will get everything you have already pushed. Is anybody else pushing data to this table? If so, you might have some ID collisions.

    Step 5. Schedule the package to run every 30 minutes; don't have it running all the time.

    Step 6 is probably not needed; all the data but one table is coming from the source, and at worst you risk losing UP TO 30 minutes of data. This seems like a wasted step.

    Step 7. Again, not a transfer - a backup and restore.

    CEWII

    Step 1. What do you mean by a backup & restore that can be in an SSIS package? Do you mean writing the backup & restore script and putting it into an SSIS package, or could we do it using the Transfer SQL Server Objects task? What's the difference between them?

    Step 4. If no one else updates that table, then fine. If they do, since the ID increases by 1, we can check the current latest ID of that table and compare it with the smallest ID in DB2's table; if the former is greater than the latter, do a loop insert from DB2 to DB1. Is that a good solution?

    Step 5. The 30-minute interval only happens during the night or at the weekend, when pushing DB2's updateable table to DB1, and as long as there is no new update in DB2, I think it's safe.

    Step 6. I can only push the updateable table back from DB2 to DB1, since DB1 is the main and original DB, so I can only update that table.

    Step 7. Should be the same as Step 1 if using backup & restore, but I should consider the existing data and have permission to overwrite the old DB.

  • princa (5/4/2010)



    Step 1. What do you mean by a backup & restore that can be in an SSIS package? Do you mean writing the backup & restore script and putting it into an SSIS package, or could we do it using the Transfer SQL Server Objects task? What's the difference between them?

    Step 4. If no one else updates that table, then fine. If they do, since the ID increases by 1, we can check the current latest ID of that table and compare it with the smallest ID in DB2's table; if the former is greater than the latter, do a loop insert from DB2 to DB1. Is that a good solution?

    Step 5. The 30-minute interval only happens during the night or at the weekend, when pushing DB2's updateable table to DB1, and as long as there is no new update in DB2, I think it's safe.

    Step 6. I can only push the updateable table back from DB2 to DB1, since DB1 is the main and original DB, so I can only update that table.

    Step 7. Should be the same as Step 1 if using backup & restore, but I should consider the existing data and have permission to overwrite the old DB.

    Step 1. Backup/restore and transfer objects are VERY different operations. A backup/restore brings EVERYTHING about the database across in its native format. A Transfer Objects task effectively scripts out the objects, applies them to the destination and then copies the data; I'm not sure whether it moves the data in native format or basically does a SELECT *, but either way I don't view it as efficient. We aren't talking about a lot of data, even if it goes to several GB. You can issue a command like the following in an Execute SQL task:

    BACKUP DATABASE SourceDataBase TO DISK = '\\fileservername\uncshare\SourceDatabase_Copy.bak' WITH STATS = 10

    This would write the backup to a file server; once it is done, you could issue a restore command to restore it from that location onto your destination server.

    Step 4. I see a problem. If you have multiple writers, then updates from your side are likely to be lost. The reason: you both insert a record with an ID of 10; you look and see that both sides have a 10 but that their side doesn't have an 11, so you insert 11 there, and your record 10 just gets forgotten. If you are the only writer to this table, then your look-and-copy method is fine, but if there is more than one writer you WILL have ID collisions and records will be lost. You may need to add something to your side to keep track of your new records and renumber collisions so they don't get lost. It isn't pretty, but it is workable. (A sketch of such a guarded, set-based copy is at the end of this post.)

    Step 5. Schedule the job anyway; it may not have anything to do, but you know the tables are always very current.

    Step 6. I may not have been clear. There is really no reason to back up the database on your side; you have VERY little risk of data loss. If the database were corrupted, you would lose up to 30 minutes of data, and your side could be recovered by the normal source update process. What I'm saying is that you don't need to do Step 6 at all.

    Step 7. Yes, it is the same as Step 1. If you can do the restore, you can do the overwrite. You don't need to consider existing data; just make sure you run your periodic process before the restore and that it completes successfully. If it does, then there should be no data in the local database that you even need to think about.

    CEWII
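
    A minimal T-SQL sketch of the Step 4 delta copy, pushing only new rows from DB2's updateable table back to DB1; the table, column and linked-server names are assumptions, and it is only safe if, as discussed above, nothing else writes to that table on DB1:

        -- Run on the DB1 server; SERVER2 is a hypothetical linked server pointing at the DB2 instance
        DECLARE @LastPushedId INT;

        -- Highest ID already present in DB1's copy of the table
        SELECT @LastPushedId = ISNULL(MAX(ID), 0)
        FROM   DB1.dbo.UpdateableTable;

        -- Copy only the rows DB2 has added since the last push (set-based, no row-by-row loop needed)
        -- If ID is an IDENTITY column on DB1, SET IDENTITY_INSERT DB1.dbo.UpdateableTable ON would also be required
        INSERT INTO DB1.dbo.UpdateableTable (ID, SomeColumn, SomeOtherColumn)
        SELECT ID, SomeColumn, SomeOtherColumn
        FROM   SERVER2.DB2.dbo.UpdateableTable
        WHERE  ID > @LastPushedId;

    If anything else does write to DB1's copy of the table, an ID-range or renumbering scheme is needed to avoid the collisions described in Step 4; this sketch assumes the simple single-writer case.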

  • Great, thanks Phil and Elliott!

    I will run the test ASAP to see if there is anything I might have got wrong, and will report back. 😀

    The only issue I'm concerned about is my permission on DB1's directory; I don't know whether I have enough permission to back up DB1 and use the backup file to restore DB2. If so, that would be great; if not, I think Transfer SQL Server Objects might be the second choice.

    Moreover, since this task has several steps to implement, I'd rather put all the operations in the SSIS package, or all in the job schedule; I don't want to split them, part in SSMS and part in SSIS. Do you think that's reasonable?

    Thanks.

  • As far as access goes, ask your network admins; there is no good reason for that to be the issue.

    As far as keeping it all in one place, that is fine, but I still see two SSIS packages: one for the backup/restore/scrubbing and another for the delta process.

    CEWII
