• We are doing something similar, though not because of the primary keys. What we did was create insert/update/delete triggers on the base tables that insert a record into a "queue" table. You must have some sort of identifier on the base tables (unique or nearly unique) that you can use to determine which rows were changed. We insert the "Key" field, a table ID (which table was updated), and a timestamp into the queue table, which also has an identity column. Another table lists all servers that should receive the changes; each record in that server table keeps track of the last ID (identity value) that server copied across. A procedure then runs in a loop at the destination server/DB, checks the control table for changes every X seconds, and deletes/inserts those rows into the local tables. We use linked servers in the copy proc. You could use bcp-out, bcp-in, and insert, but that's extra work; linked servers are much easier.
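    A minimal sketch of the queue side of this setup might look like the following. All table, column, and trigger names (ChangeQueue, ServerList, Orders, OrderKey) are made up for illustration; only the shape matches what's described above.

    ```sql
    -- Queue table: one row per changed base-table row, with an identity
    -- column so subscribers can track how far they've read.
    CREATE TABLE dbo.ChangeQueue (
        QueueID   int IDENTITY(1,1) PRIMARY KEY,
        TableID   int         NOT NULL,              -- which base table changed
        KeyValue  varchar(50) NOT NULL,              -- identifier of the changed row
        ChangedAt datetime    NOT NULL DEFAULT GETDATE()
    );

    -- One row per destination server, holding the last identity value
    -- that server has already copied across.
    CREATE TABLE dbo.ServerList (
        ServerName  sysname NOT NULL PRIMARY KEY,
        LastQueueID int     NOT NULL DEFAULT 0
    );

    -- Trigger on a hypothetical base table; logs keys from both the
    -- inserted and deleted pseudo-tables so all three operations are caught.
    CREATE TRIGGER trg_Orders_Change ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        INSERT INTO dbo.ChangeQueue (TableID, KeyValue)
        SELECT 1, OrderKey FROM inserted
        UNION
        SELECT 1, OrderKey FROM deleted;
    END;
    ```

    Using UNION (rather than UNION ALL) keeps an update from queueing the same key twice.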

    We also have a master proc that calls the copy procs on each "subscriber". The master proc runs as a SQL Agent job on the "Publisher", and is configured to start when the server starts.
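    The copy proc at a subscriber could be sketched roughly like this. PUBSRV is an assumed linked-server name, and all table/column names (ChangeQueue, ServerList, Orders, OrderKey, Amount) are hypothetical, not the actual schema.

    ```sql
    CREATE PROCEDURE dbo.CopyChanges
    AS
    BEGIN
        DECLARE @LastID int, @MaxID int;

        -- Watermark: last queue identity value this server already copied.
        SELECT @LastID = LastQueueID
        FROM dbo.ServerList
        WHERE ServerName = @@SERVERNAME;

        -- Highest queued change currently on the publisher.
        SELECT @MaxID = MAX(QueueID)
        FROM PUBSRV.PubDB.dbo.ChangeQueue;

        IF @MaxID IS NULL OR @MaxID <= @LastID
            RETURN;

        -- Delete the changed rows locally, then re-insert them from the
        -- publisher via the linked server (delete/insert, not merge).
        DELETE o
        FROM dbo.Orders AS o
        JOIN PUBSRV.PubDB.dbo.ChangeQueue AS q
          ON q.KeyValue = o.OrderKey
        WHERE q.QueueID > @LastID AND q.QueueID <= @MaxID;

        INSERT INTO dbo.Orders (OrderKey, Amount)
        SELECT s.OrderKey, s.Amount
        FROM PUBSRV.PubDB.dbo.Orders AS s
        WHERE s.OrderKey IN (SELECT KeyValue
                             FROM PUBSRV.PubDB.dbo.ChangeQueue
                             WHERE QueueID > @LastID AND QueueID <= @MaxID);

        -- Advance this server's watermark.
        UPDATE dbo.ServerList
        SET LastQueueID = @MaxID
        WHERE ServerName = @@SERVERNAME;
    END;
    ```

    The master proc on the publisher would then just call each subscriber's copy proc in a loop with a WAITFOR DELAY between passes.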

    Let me know if you need any additional info...good luck.

    -Dan

