Transaction log

  • Hi Gurus,

    I wonder if you could help me out with the following issue:

    We have a process that eats up all the disk space allocated for the transaction log file. I have tried running the process with the recovery model set to simple, but it still does the same thing.

    Is it possible to run the process but stop anything from being written to the transaction log file? Of course I will do a full backup before this is done, so that we can recover in case of a problem.

    Cheers

    Tuan

  • Regardless of your recovery model, SQL Server has to use the transaction log to roll back any failed statements (or group of statements if bound up in a single transaction).

    Your process probably has a single statement (or transaction) that updates or deletes a large number of records. Try splitting this up, so that it does fewer updates or deletes at one time.
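
    For example, here is a minimal sketch of a batched delete, assuming a hypothetical table big_table with an archived flag. SET ROWCOUNT caps how many rows each pass (and therefore each implicit transaction) touches, so the log only has to hold one batch at a time:

    set rowcount 10000          -- limit each delete to 10,000 rows

    declare @rows int
    select @rows = 1

    while @rows > 0
    begin
        delete from big_table where archived = 1   -- removes at most 10,000 rows per pass
        select @rows = @@rowcount                  -- 0 once nothing is left to delete
        -- a log backup (or a checkpoint under the simple recovery model) between
        -- batches keeps the log from growing
    end

    set rowcount 0              -- reset back to unlimited
    go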

  • Regardless of the recovery model, as the previous reply said, any activity is logged. There are a couple of things you can do to decrease the number of entries written to the log (a sketch of setting options 1 and 2 follows the sample script below):

    1. Set the "truncate log on checkpoint" option on for that database.

    2. If you are doing a lot of data loads, you can turn the bulk copy/insert option on.

    3. Dump the transaction log at a regular interval (based on how fast the log fills up) and then truncate the log:

    sample script:

    /* you may need to run this a couple of times if you don't see the results immediately */
    use <database name>
    go

    -- back up (dump) the log so its inactive portion can be reused
    backup log <database name>
    to <backup device name> with init
    go

    -- create a throwaway table and generate some activity so the active
    -- portion of the log wraps around, letting the shrink reclaim space
    create table t_dba_log_shrink(
        col1 int null,
        col2 char(8) null)
    go

    declare @var1 int
    select @var1 = 1
    while @var1 < 20000
    begin
        insert into t_dba_log_shrink values (1, '12345678')
        select @var1 = @var1 + 1
    end
    go

    delete from t_dba_log_shrink
    go

    -- shrink the log file to the target size (in MB)
    dbcc shrinkfile ('<transaction log file name>', <size you want to reduce to, say 50>)
    go

    drop table t_dba_log_shrink
    go
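
    For options 1 and 2 above, the database options themselves are set as follows. This is only a sketch, assuming a database named MyDB (substitute your own); on SQL Server 2000 these are sp_dboption settings, while later versions use the recovery model instead:

    -- Option 1: truncate the log at every checkpoint (SQL Server 2000)
    exec sp_dboption 'MyDB', 'trunc. log on chkpt.', 'true'
    go

    -- Option 2: allow minimally logged bulk operations (SQL Server 2000)
    exec sp_dboption 'MyDB', 'select into/bulkcopy', 'true'
    go

    -- On SQL Server 2005 and later, the rough equivalents are the recovery models:
    -- alter database MyDB set recovery simple        -- option 1
    -- alter database MyDB set recovery bulk_logged   -- option 2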

  • You can use the bulk-logged recovery model.

    Also, you can create alerts with job steps for when your log becomes full; they can truncate the log and then shrink it.

    You must be executing long-running queries; that's why the active/inactive portion of the log keeps growing.
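
    As a rough sketch of both suggestions (the database name MyDB, the backup path, and the logical log file name MyDB_log are assumptions; use your own):

    -- switch the database to the bulk-logged recovery model
    alter database MyDB set recovery bulk_logged
    go

    -- job step an alert could fire when the log is nearly full:
    -- back up the log (freeing its inactive portion), then shrink the file
    backup log MyDB to disk = 'D:\backups\MyDB_log.bak' with init
    go

    use MyDB
    go
    dbcc shrinkfile ('MyDB_log', 50)   -- shrink the log file to about 50 MB
    go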

  • See if you can break the large transaction up into multiple smaller transactions.
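
    For instance, a hypothetical sketch that walks a large update through a table in keyed ranges, committing each range as its own small transaction (big_table, its id key, and the status column are all made-up names):

    declare @start int, @maxid int, @batch int
    select @batch = 10000
    select @maxid = max(id) from big_table

    select @start = 1
    while @start <= @maxid
    begin
        begin transaction
            update big_table
            set status = 'processed'                     -- hypothetical column and value
            where id >= @start and id < @start + @batch  -- one small range per transaction
        commit transaction

        select @start = @start + @batch
        -- with the simple recovery model (or a log backup here), the log space
        -- used by each committed range can be reused before the next one starts
    end
    go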

    -Chad

    http://www.clrsoft.com

    Software built for the Common Language Runtime


