Making tables Readonly during a DTS run

  • Hello out there,

    I have 3 tables that I need to "lock" from updates while a DTS package runs.  This DTS can take up to 15 minutes to complete, as it runs several data transformations that dump to .txt files which are then pushed out via FTP.

    Any update to any one of these 3 tables makes the .txt files inconsistent with one another and my FTP recipient will not process the file due to the inconsistencies.

    I've thought about writing a script to remove insert & update permissions for the role that updates these tables, but I'm not sure whether I need to kill the users' processes for the permissions change to take effect.

    I've also looked up ways to lock the database, but I'm not sure which lock I need, nor how to implement it in this DTS.  It seems that as soon as the statement is over, the lock is released.

    I'd love to hear about a simple command I could place at the beginning of my DTS workflow to lock the tables and at the end to unlock them.

    Any ideas or suggestions are very much welcomed!

    Terrence

  • If you can finagle all the transformations into a single transaction, you can do this:

    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

    BEGIN TRANSACTION

    -- do your transformations here
    ...

    COMMIT TRANSACTION

    There are some good articles on this site about transactions.
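
    If you need the tables to stay readable but not updateable for the whole run, you could also take shared table locks inside that transaction and hold them until the commit. This is only a rough sketch, and the table names are placeholders:

    BEGIN TRANSACTION

    -- Take a shared table lock on each table and hold it until the COMMIT.
    -- Other sessions can still read, but INSERT/UPDATE/DELETE is blocked.
    SELECT TOP 1 1 FROM dbo.Table1 WITH (TABLOCK, HOLDLOCK)
    SELECT TOP 1 1 FROM dbo.Table2 WITH (TABLOCK, HOLDLOCK)
    SELECT TOP 1 1 FROM dbo.Table3 WITH (TABLOCK, HOLDLOCK)

    -- do your transformations here
    ...

    COMMIT TRANSACTION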

    There is no "i" in team, but idiot has two.
  • You could also write a script to KILL all connections to the database and then put the database into READ ONLY mode.

    You could also put a TABLE LOCK on the tables you are using. If I had a better idea of how you are pulling the data, I could give more specific advice on how to resolve the issue.

    You can find more information on TABLE LOCKS and SQL Server modes in SQL Server Books Online.
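
    If you go the READ ONLY route, the statements themselves are simple. The sketch below uses a placeholder database name, and the ROLLBACK IMMEDIATE clause will roll back and disconnect anything still running, so use it with care:

    -- Before the DTS package runs: disconnect active work and make the DB read-only.
    ALTER DATABASE YourDatabase SET READ_ONLY WITH ROLLBACK IMMEDIATE

    -- ... the DTS transformations and FTP push happen here ...

    -- After the package finishes: open the database back up for changes.
    ALTER DATABASE YourDatabase SET READ_WRITE WITH ROLLBACK IMMEDIATE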

  • Being the GUI junkie that I am (and lacking the skill set), I won't be able to run my transformations to .txt files using SQL statements, so it would appear that locking the tables is out of the question for me in this particular scenario.

    What I've decided to do instead is kill off the connections and put the specific tables into a read-only state.  My code would then look as follows:

    /* Declare the temp tables for spid information */
    set nocount on

    create table #sp_who (
      spid     smallint,
      ecid     smallint,
      status   nchar(30),
      loginame nchar(128),
      hostname nchar(128),
      blk      char(5),
      dbname   nchar(128),
      cmd      nchar(16))

    create table #spids2kill (
      spid     smallint)

    insert into #sp_who execute sp_who

    insert into #spids2kill
       select spid
       from #sp_who
       where dbname = 'DB_2_Lock' and spid <> @@spid

    /* Remove change permissions on the tables for the defined role */
    revoke UPDATE, INSERT, DELETE on [dbo].[tbl_2_lock] to [Role]
    GO
    revoke UPDATE, INSERT, DELETE on [dbo].[tbl_2_locka] to [Role]
    GO
    revoke UPDATE, INSERT, DELETE on [dbo].[tbl_2_lockb] to [Role]
    GO

    /* Kill the connections gathered above */
    declare @counter int
    declare @killit smallint
    declare @killstring varchar(20)

    set @counter = (select count(*) from #spids2kill)

    while @counter > 0
    begin
      set @counter = @counter - 1
      set @killit = (select top 1 spid from #spids2kill)
      select @killstring = 'Kill ' + convert(varchar(10), @killit)
      exec (@killstring)
      delete from #spids2kill where spid = @killit
    end

    /* Separate script puts perms back & cleans up after the transformations take place */
    GRANT SELECT, UPDATE, INSERT, DELETE ON [dbo].[tbl_2_lock] TO [Role]
    GO
    GRANT SELECT, UPDATE, INSERT, DELETE ON [dbo].[tbl_2_locka] TO [Role]
    GO
    GRANT SELECT, UPDATE, INSERT, DELETE ON [dbo].[tbl_2_lockb] TO [Role]
    GO

    /* Cleanup */
    drop table #sp_who
    drop table #spids2kill

    I'm thinking this will give me what I need.  Does anyone see any shortcomings or pitfalls I should be aware of when doing this?

    Thank you!  -Terrence

  • Do you have a test environment you could try it in?

    It should be OK as long as you are only reading from the table.

  • I do have a dev server that I gave this a go on, and it worked fine.  The only problem is that the application that updates these tables is written in VB6.  When its connections were killed, the program would bomb out, even when it only attempted a select on the tables.

    We finally settled on having him do a select against the table to return its permissions, and coding the application so that if the tables were read-only it would not perform the update.
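
    For anyone curious, that kind of check can be done from T-SQL with the PERMISSIONS() function.  This is just a rough sketch against one of my tables, and the bit values in the comments are from memory of Books Online, so double-check them before relying on this:

    -- Rough sketch: does the current connection still have UPDATE on the table?
    -- PERMISSIONS() returns a bitmap of the caller's permissions on the object;
    -- bit 2 should be UPDATE (and bit 8 INSERT) per Books Online -- verify first.
    IF (PERMISSIONS(OBJECT_ID('dbo.tbl_2_lock')) & 2) = 0
       PRINT 'Table is locked down right now - skip the update'
    ELSE
       PRINT 'Update permission is in place - go ahead'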

    All I ended up doing was putting revoke commands in at the beginning of the DTS and grant commands at the end.  Between my additions and his code change, it appears we've licked this problem.

    Thanks for the suggestions! --Terrence
