New Database Build

  • Good morning all.

    Has anyone been able to get an alert to fire off when a new database is added to SQL Server 2005? I had it working in 2000 and 7.0, but there have been some changes, so it is not working in 2005.

    There is another DBA in the shop who does not seem to want to build backups or notify anyone when he builds a new database!

     

    Thanks for your comments and suggestions.


    Stacey W. A. Gregerson

  • Hi,

    I found a workaround: I create maintenance plans to back up all databases, or all user databases (whatever is appropriate). Then if a new database is added, it gets backed up.

    Regards, Yelena Varsha

  • Thank you for your reply. Each database is different and requires a different type of maintenance, so I create backup plans and tuning checks based on each database's usage.

     

    Thanks though for your suggestion.

     


    Stacey W. A. Gregerson

  • Then I would keep track of the number of databases in a table and run a job once a day that compares the result of SELECT COUNT(*) FROM sysdatabases to the previous value.

    I myself run a job every day that browses my servers, gets the day of the last backup for all databases, and sends me an email. That way I know if databases were not backed up, and how many records are in the result set. If the number of rows changes, then I know that somebody created a database on one of the servers, so I read the whole list. If the number is the same, then I check that the number of days since the last backup is 0 or 1 for every database.

    I run this job using SQL Server 2005, Database Mail, the OPENROWSET statement, and a list of servers in a table; I compose dynamic query strings and query the msdb database. I do have to have a login on each server with read-only access to msdb.
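    The count-comparison part of this could be sketched roughly as follows. This is only a sketch: the DatabaseCountHistory table lives in a hypothetical DBA utility database, and the alert action is left as a PRINT where you might call sp_send_dbmail instead.

    ```sql
    -- Hypothetical history table to hold one count per daily run
    IF OBJECT_ID('dbo.DatabaseCountHistory') IS NULL
        CREATE TABLE dbo.DatabaseCountHistory
        (
            checkdate datetime NOT NULL DEFAULT GETDATE(),
            dbcount   int      NOT NULL
        );
    GO

    DECLARE @prev int, @curr int;

    -- Most recent recorded count (NULL on the very first run)
    SELECT TOP 1 @prev = dbcount
    FROM dbo.DatabaseCountHistory
    ORDER BY checkdate DESC;

    -- Current count of databases on this instance
    SELECT @curr = COUNT(*) FROM master.dbo.sysdatabases;

    IF @curr <> ISNULL(@prev, @curr)
        PRINT 'Database count changed - somebody created or dropped a database.';
        -- ...or send mail / raise an alert here instead of PRINT

    INSERT INTO dbo.DatabaseCountHistory (dbcount) VALUES (@curr);
    ```

    Scheduled as a daily SQL Agent job step, this flags any day on which the database count moved.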

     

    Regards, Yelena Varsha

  • Now there is an approach I had not thought of. I like it. I will look at it and see what it will take to get it set up.

     

    Thanks


    Stacey W. A. Gregerson

  • You should be able to use a DDL trigger. This is straight from BOL; you would have to come up with a schema for notification, but it should not be too hard to add an sp_send_dbmail statement to this.

    CREATE TRIGGER ddl_trig_database
    ON ALL SERVER
    FOR CREATE_DATABASE
    AS
        PRINT 'Database Created.'
        SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
    GO

    DROP TRIGGER ddl_trig_database
    ON ALL SERVER
    GO
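    As a sketch of the sp_send_dbmail idea, the same trigger could be extended to mail the statement that created the database. This assumes Database Mail is already configured; the profile name and recipient address below are hypothetical placeholders.

    ```sql
    -- Sketch: BOL trigger extended to notify via Database Mail.
    -- 'DBAProfile' and the recipient address are hypothetical; substitute your own.
    CREATE TRIGGER ddl_trig_database
    ON ALL SERVER
    FOR CREATE_DATABASE
    AS
        DECLARE @cmd nvarchar(max);
        SET @cmd = EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)');

        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DBAProfile',
            @recipients   = 'dba@example.com',
            @subject      = 'New database created',
            @body         = @cmd;
    GO
    ```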

     

  • It depends on what you want to do with the alert. If you're auditing for information purposes, then Event Notifications are best suited for this.

    If you want to enforce some rules instead, say, roll back when a table doesn't have a primary key and send an error stating the reason the script failed, then you'd use triggers.

    We've built out a very elaborate auditing system using the service broker and triggers.

    Here's how you would get a notification on when a database was created:

    First, I create a database that will house the service queues for the service broker:

    USE master
    GO

    IF DB_ID('AUDITEVENTS') IS NOT NULL
        DROP DATABASE AUDITEVENTS
    GO

    CREATE DATABASE AUDITEVENTS
    GO

    USE AUDITEVENTS
    GO

    Next, I set the database as trustworthy, because let's say I wish to send these events to a centralized database on another server for further analysis, or maybe that's where I keep all the audit information. I also set the database owner to SA, just in case it gets set to somebody with fewer privileges for distributed transactions. Again, you only need to do this if you plan on shipping your data to another database when the event fires.

    ALTER DATABASE AUDITEVENTS SET TRUSTWORTHY ON;
    GO

    ALTER AUTHORIZATION ON DATABASE::AUDITEVENTS TO [sa];
    GO

    If you wish to make it really easy for your server to ship events off to another box, use a synonym for the audit table, for example:

    --CREATE SYNONYM AuditDDLEvents FOR MyDBAServer.DBAAudit.dbo.AuditDDLEvents
    --GO

    Now we create a queue that will manage the events that get fired. This queue keeps all messages until they can be delivered. Keep in mind you can "fake" delivery for messages you don't want to process, thereby further refining the events you wish to store. You shouldn't need to do this if you have a clear idea of what you want.

    Here's how to create the Queue:

    CREATE QUEUE AuditDDLQ WITH STATUS = ON
    GO

    Now we create the service that will route messages to the queue we just created.

    CREATE SERVICE AuditService
    ON QUEUE AuditDDLQ
    -- THE CONTRACT IS BUILT IN
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification])
    GO

    I hope this is self-explanatory; the only thing that isn't is the contract. A contract must be defined, but since we want to be notified of events within SQL Server, the contract is already built in. You would need to pay attention to this in much greater detail if you used Service Broker for custom messages.

    Then we create an Event Notification for what we want to catch.

    CREATE EVENT NOTIFICATION AuditDDL
    ON SERVER                                        -- SCOPE
    FOR DDL_DATABASE_LEVEL_EVENTS                    -- EVENT GROUP
    TO SERVICE 'AuditService', 'current database'    -- TARGET SERVICE
    GO
     
    For my example, I've chosen all DDL database events, which includes create database; it also includes drop table, create table, etc.
     
    To test your queue:

    -- READ QUEUE
    SELECT * FROM AuditDDLQ

    -- TEST
    CREATE TABLE FOO(BAR int)

    -- READ QUEUE AGAIN
    SELECT * FROM AuditDDLQ

    -- MAKE SOME SENSE OUT OF THE MESSAGE BODY
    SELECT CAST(message_body AS xml) FROM AuditDDLQ
     
    Now, here's where it gets fun: you have to do something with those messages, or they stay in the queue, and you don't want them to stay there.

    You would create a stored procedure to read messages from the queue and act on the events that get fired:
     

    CREATE PROCEDURE dbo.ReadAuditDDLQ
    AS
    DECLARE @message_type nvarchar(128);
    DECLARE @message varbinary(max);
    DECLARE @eventdata AS XML;
    DECLARE @messageText AS nvarchar(max);

    WHILE (1=1)
    BEGIN
        BEGIN TRANSACTION

        -- Receive the next available message from the queue
        WAITFOR (
            RECEIVE TOP(1)
                @message_type = message_type_name,
                @message = message_body
            FROM AuditDDLQ
        ), TIMEOUT 1000

        IF (@@ROWCOUNT = 0)
        BEGIN
            ROLLBACK TRANSACTION
            BREAK
        END

        -- For testing, get everything in the queue
        SET @messageText = CAST(@message AS nvarchar(max));
        SET @eventdata = (SELECT CAST(ISNULL(@messageText, N'<null/>') AS xml));

        -- Currently we don't care about the CREATE_STATISTICS event.
        IF (SELECT CAST(@eventdata.query('data(//EventType)') AS sysname)) = 'CREATE_STATISTICS'
        BEGIN
            --DELETE FROM AuditDDLQ WHERE @message = message_body;
            -- You shouldn't need to uncomment the above line; committing the transaction makes the event go poof.
            COMMIT TRANSACTION;
            BREAK;
        END
        ELSE
            -- The table below obviously needs to be created before inserting data into it.
            INSERT INTO AuditDDLEvents(
                servername, databasename, posttime, eventtype, loginname, schemaname,
                objectname, targetobjectname, message_body)
            VALUES (
                CAST(@eventdata.query('data(//ServerName)') AS sysname),
                CAST(@eventdata.query('data(//DatabaseName)') AS sysname),
                CAST(@eventdata.query('data(//PostTime)') AS varchar(23)),
                CAST(@eventdata.query('data(//EventType)') AS sysname),
                CAST(@eventdata.query('data(//LoginName)') AS sysname),
                CAST(@eventdata.query('data(//SchemaName)') AS sysname),
                CAST(@eventdata.query('data(//ObjectName)') AS sysname),
                CAST(@eventdata.query('data(//TargetObjectName)') AS sysname),
                @message
            );

        COMMIT TRANSACTION
    END
    GO

     
     
    In case you're interested, here's the command to create the AuditDDLEvents Table:
     

    IF OBJECT_ID('dbo.AuditDDLEvents') IS NULL
    CREATE TABLE dbo.AuditDDLEvents
    (
        lsn              int NOT NULL IDENTITY,
        servername       sysname NOT NULL,
        databasename     sysname NOT NULL,
        posttime         varchar(23) NOT NULL,
        eventtype        sysname NOT NULL,
        loginname        sysname NOT NULL,
        schemaname       sysname NOT NULL,
        objectname       sysname NOT NULL,
        targetobjectname sysname NOT NULL,
        message_body     varbinary(max) NOT NULL,
        CONSTRAINT PK_AuditDDLEvents PRIMARY KEY(lsn)
    );
    GO
     
    Anyway, now we have to attach the stored procedure to the queue so that it fires automatically for us.
     

    -- ACTIVATE SERVICE PROCEDURE, 1 READER
    ALTER QUEUE AuditDDLQ WITH STATUS = ON,
    ACTIVATION (
        STATUS = ON,
        PROCEDURE_NAME = dbo.ReadAuditDDLQ,
        MAX_QUEUE_READERS = 1,
        EXECUTE AS SELF)
    GO

     
    Now we check our AuditDDLQ:
     

    SELECT * FROM AuditDDLQ

    We see it's empty; now check our table:

    SELECT * FROM AuditDDLEvents

    And there you go: you're auditing all the DDL events. If you wish to narrow it down to specific events, change the event notification to fire on specific events instead of a blanket category. You can find the event groups in Books Online.

    -- TEST AGAIN
    DROP TABLE FOO
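    As a sketch of the narrowing mentioned above: if new databases are all you care about, the notification can be recreated for the single CREATE_DATABASE event rather than a whole event group, reusing the same queue and service. (A sketch only; check the exact event names against Books Online for your build.)

    ```sql
    -- Drop the blanket notification and recreate it for just CREATE_DATABASE
    DROP EVENT NOTIFICATION AuditDDL ON SERVER
    GO

    CREATE EVENT NOTIFICATION AuditDDL
    ON SERVER
    FOR CREATE_DATABASE                              -- single event instead of an event group
    TO SERVICE 'AuditService', 'current database'
    GO
    ```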

     
     
    Once you're finished playing around you can clean it all up if you'd like, just uncomment the code below:

    /*
    -- CLEANUP
    use AuditEvents
    go
    DROP EVENT NOTIFICATION AuditDDL ON SERVER
    go
    USE master
    go
    ALTER DATABASE AUDITEVENTS SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    DROP DATABASE AUDITEVENTS
    */

     

    Christopher Ford

  • Forgot to mention the reason the message_body is stored as varbinary in the table: it's because you cannot send XML via a linked server transaction. So if you're not using a linked server to offload the data, then you'd probably want this version:

    CREATE PROCEDURE dbo.ReadAuditDDLQ
    AS
    DECLARE @message_type nvarchar(128);
    DECLARE @message AS nvarchar(max);
    DECLARE @eventdata AS XML;

    WHILE (1=1)
    BEGIN
        BEGIN TRANSACTION

        -- Receive the next available message from the queue
        WAITFOR (
            RECEIVE TOP(1)
                @message_type = message_type_name,
                @message = message_body
            FROM AuditDDLQ
        ), TIMEOUT 1000

        IF (@@ROWCOUNT = 0)
        BEGIN
            ROLLBACK TRANSACTION
            BREAK
        END

        -- For testing, get everything in the queue
        SET @eventdata = (SELECT CAST(ISNULL(@message, N'<null/>') AS xml));

        -- Currently we don't care about the CREATE_STATISTICS event.
        IF (SELECT CAST(@eventdata.query('data(//EventType)') AS sysname)) = 'CREATE_STATISTICS'
        BEGIN
            --DELETE FROM AuditDDLQ WHERE @message = message_body;
            -- You shouldn't need to uncomment the above line; committing the transaction makes the event go poof.
            COMMIT TRANSACTION;
            BREAK;
        END
        ELSE
            -- The table below obviously needs to be created before inserting data into it.
            INSERT INTO AuditDDLEvents(
                servername, databasename, posttime, eventtype, loginname, schemaname,
                objectname, targetobjectname, message_body)
            VALUES (
                CAST(@eventdata.query('data(//ServerName)') AS sysname),
                CAST(@eventdata.query('data(//DatabaseName)') AS sysname),
                CAST(@eventdata.query('data(//PostTime)') AS varchar(23)),
                CAST(@eventdata.query('data(//EventType)') AS sysname),
                CAST(@eventdata.query('data(//LoginName)') AS sysname),
                CAST(@eventdata.query('data(//SchemaName)') AS sysname),
                CAST(@eventdata.query('data(//ObjectName)') AS sysname),
                CAST(@eventdata.query('data(//TargetObjectName)') AS sysname),
                @message
            );

        COMMIT TRANSACTION
    END
    GO
    --------------------------
    And the table would be:

    --------------------------

    IF OBJECT_ID('dbo.AuditDDLEvents') IS NULL
    CREATE TABLE dbo.AuditDDLEvents
    (
        lsn              int NOT NULL IDENTITY,
        servername       sysname NOT NULL,
        databasename     sysname NOT NULL,
        posttime         varchar(23) NOT NULL,
        eventtype        sysname NOT NULL,
        loginname        sysname NOT NULL,
        schemaname       sysname NOT NULL,
        objectname       sysname NOT NULL,
        targetobjectname sysname NOT NULL,
        message_body     xml NOT NULL,
        CONSTRAINT PK_AuditDDLEvents PRIMARY KEY(lsn)
    );
    GO

     

    Christopher Ford

  • Sorry, LAST post about this...

    You will want to apply SP2a to your SQL 2005 boxes, because there's a bug in Service Broker. If you monitor connections with MOM or another admin tool, it will go crazy about the connections being made: Service Broker increments the connection perfmon counter but doesn't decrement it when done. So with 700 users hitting your system making updates, it quickly jumps to many thousands, all kinds of alarms go off, people get angry... it's not pretty.

    The connections don't exist, either, which leaves everyone scratching their heads as to why something like MOM says there are 45,000 connections to the database when only 50 users are actually connected. Yes, a few late nights with that one... let me save you the trouble and tell you ahead of time: install SP2.

    Christopher Ford

  • Thanks Christopher,

    This is a lot for me to review. I already have SP2 installed. We do not use MOM (yet; I am hopeful), we use Patrol. I do have one server that is just for my testing and admin recording. I still have some learning to do on Service Broker.

    I do thank you for all the information you have provided. This is more than I needed, but it sounds like what I should do.


    Stacey W. A. Gregerson

  • It looks like a lot, but if you just copy and paste the T-SQL in the order of the blocks I posted and run each block by itself, you'll see first-hand how it works. That should turn on the "ah-hah" light bulbs.

    Plus, delivery is guaranteed, so you don't miss any events due to server processing or overload if that ever happens.

     

    Christopher Ford
