Debugger For SQL

  • Or write up a short article. It's always good to see how someone tests and their results.

  • "I've rarely used the debugger,"

    You're a better developer than I am (goes without saying).

    I don't always need to debug every new stored procedure I write, but when I do, having a working debugger is a real time-saver.

  • The original benchmark is no longer available, but it used C++, SQL Server 2000, and various other components on a physically isolated network without any routers. The schema had about 20 tables, and a mixed load of 16 different business functions was benchmarked with:

    SQL Server 2000 on a 4-CPU, 4 GB RAM server

    .NET on four 1-CPU blades for the middle tier

    .NET on two 1-CPU blades to simulate the clients

    MSMQ on one 1-CPU blade

    One server for infrastructure, running Active Directory and DNS

    One server for Exchange

    There was also a file server used for backups, source, and saving various scripts, but it was not used during the benchmarks.

    Attached is a zip with 22 files for a simpler benchmark that runs 8 of the solutions at the same time.

    The union solution uses half the resources of the array solution and runs in half the time. All times are in milliseconds:

    Averages:

    Solution  Duration    CPU    IO
    Array       211.75  40.88  3.25
    Union       111.75  21.75  2.13
    Diff           47%    47%   35%

    Standard deviations:

    Solution  Duration    CPU    IO
    Array        28.75  21.63  0.88
    Union        25.94  13.75  0.91

    Here are the names of the files, their purpose, and any revisions that need to be made:

    Benchmark_Schema.sql - tables and stored procedure source

    DataCache.sql - ensures that the data is in cache

    SQL command files for the two solutions - the WAITFOR time needs to be revised:

    Benchmark_Array.sql - array solution

    Benchmark_Union.sql - UNION solution

    Command files - the SQL Server name and directory names need to be revised:

    DataCache.cmd - command file to run DataCache.sql

    Run_array_(1-8).cmd - runs eight copies of Benchmark_Array.sql simultaneously

    Run_union_(1-8).cmd - runs eight copies of Benchmark_Union.sql simultaneously
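    For readers skimming the thread, the difference between the two solutions being benchmarked can be sketched roughly like this (the procedure and table names below are placeholders for illustration, not the actual objects in the attached scripts):

    ```sql
    -- "Array" solution: the client sends one delimited string,
    -- and the server has to split it apart before joining.
    EXEC dbo.GetOrders @OrderIDList = '880|881|882';

    -- "Union" solution: the client sends ordinary SQL that materializes
    -- the list as rows, so no server-side string parsing is needed.
    INSERT INTO #OrderIDList (OrderID)
    SELECT 880 UNION ALL
    SELECT 881 UNION ALL
    SELECT 882;

    EXEC dbo.GetOrders_FromTempTable;
    ```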

    SQL = Scarcely Qualifies as a Language

  • Carl, I just performed the following test. Please tell me what you're doing differently in this benchmark than what I'm doing:

    create table dbo.Calendar (
        Date datetime primary key,
        constraint CK_Date check
            (datepart(hour, date) = 0
             and datepart(minute, date) = 0
             and datepart(second, date) = 0
             and datepart(millisecond, date) = 0),
        DateYear as datepart(year, date),
        DateMonth as datepart(month, date),
        DateDay as datepart(day, date),
        DateWeekday as datepart(weekday, date));
    go

    insert into dbo.Calendar (date)
    select dateadd(day, number, '1/1/2000')
    from dbo.Numbers;
    go

    create function [dbo].[StringParserXML]
        (@String_in varchar(max),
         @Delimiter_in char(1)) -- char(1): a space-padded char(10) delimiter would not match in REPLACE
    returns @Parsed table (
        Row int,
        Parsed varchar(100))
    as
    -- This one is faster than StringParser2, but it doesn't correctly handle
    -- XML-specific characters, such as "<" or "&". StringParser2 will handle those
    -- without difficulty.
    begin
        if right(@string_in, 1) = @delimiter_in
            select @string_in = left(@string_in, len(@string_in) - 1);

        declare @XML xml;
        -- Wrap each element in <i> tags (reconstructed: the forum stripped the
        -- XML markup from the original post).
        select @xml = '<i>' + replace(@string_in, @delimiter_in, '</i><i>') + '</i>';

        insert into @parsed (row, parsed)
        select row_number() over (order by x.i.value('.', 'varchar(100)')),
               x.i.value('.', 'varchar(100)')
        from @xml.nodes('//i') x(i);

        return;
    end;
    go

    create proc ParamsTest1
        (@DateList_in varchar(max))
    as
    set nocount on;
    declare @Date datetime;
    select @date = date
    from dbo.Calendar
    where date in
        (select parsed
         from dbo.StringParserXML(@datelist_in, '|'));
    go

    create proc ParamsTest2
    as
    set nocount on;
    declare @Date datetime;
    select @date = date
    from dbo.Calendar
    where date in
        (select date
         from #DateList);
    go

    create table #T (
        Start datetime,
        Finish datetime,
        Prc varchar(100));
    go

    set nocount on;
    insert into #T (Start, Prc)
    select getdate(), 1;
    go

    exec dbo.ParamsTest1
        '1/1/2000|2/1/2000|3/1/2000|4/1/2000|5/1/2000|6/1/2000|7/1/2000|8/1/2000|9/1/2000|10/1/2000|11/1/2000|12/1/2000';
    go 1000

    update #T
    set finish = getdate()
    where finish is null;
    go

    insert into #T (Start, Prc)
    select getdate(), 2;
    go

    create table #DateList (
        Date datetime);
    insert into #DateList (date)
    select '1/1/2000' union all
    select '2/1/2000' union all
    select '3/1/2000' union all
    select '4/1/2000' union all
    select '5/1/2000' union all
    select '6/1/2000' union all
    select '7/1/2000' union all
    select '8/1/2000' union all
    select '9/1/2000' union all
    select '10/1/2000' union all
    select '11/1/2000' union all
    select '12/1/2000';
    exec dbo.ParamsTest2;
    drop table #DateList;
    go 1000

    update #T
    set finish = getdate()
    where finish is null;
    go

    select *, datediff(millisecond, start, finish)
    from #T;
    I got 16,236 milliseconds for 1,000 runs of the temp table version, and 2,593 milliseconds for the string parsing version. So the string parsing version took roughly one-sixth of the time the temp table version took.

    I'm assuming from this that you ran some other benchmark on this.

    This test was done in SQL 2005.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • And, of course, I ask for your benchmark, and we post at the same time. 🙂

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC

  • GSquared:

    Thanks for your benchmark, but what I have found is that when you run the benchmark from twice as many clients as there are CPUs on the database server, the duration of the UDF increases dramatically.

    Since all of the applications I have worked on have been multi-user, this does make a difference in the results.

    Using the simple Sales Order schema that was provided, can you code a solution using XML as a communication method?
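    For reference, an XML communication approach along the lines Carl is asking about might look something like this (a sketch only; the five column names are guesses based on the sample row "880, 12, 2, 32, 0" that appears later in the thread):

    ```sql
    create proc dbo.InsertSalesOrdersXML
        (@Orders xml)
    as
    set nocount on;

    -- Shred one XML parameter into rows; attribute names are placeholders.
    insert into #SalesOrder (OrderID, CustomerID, ProductID, Quantity, Discount)
    select o.x.value('@OrderID',    'int'),
           o.x.value('@CustomerID', 'int'),
           o.x.value('@ProductID',  'int'),
           o.x.value('@Quantity',   'int'),
           o.x.value('@Discount',   'int')
    from @Orders.nodes('/Orders/Order') o(x);
    go

    -- Called with the whole order list as one XML parameter:
    exec dbo.InsertSalesOrdersXML
         @Orders = '<Orders><Order OrderID="880" CustomerID="12" ProductID="2" Quantity="32" Discount="0"/></Orders>';
    ```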

    SQL = Scarcely Qualifies as a Language

  • Here's some "simpler" code for the "single run" test... the cteTally split blows the SELECT/UNION ALL method away by 51.7 times on my box. I didn't put it in a code box just so you don't have to scroll as much... 😛

    Here's the test code...

    --=================================================================================================
    -- This section contains "common" setup code for the test
    --=================================================================================================
    --===== Setup for a "pretty" output
    SET NOCOUNT ON

    --===== We're going to store the results of each "split" in Temp tables.
         -- This is just to make sure they don't already exist to make rerunning
         -- the code easier.
    IF OBJECT_ID('TempDB..#CarlTest','U') IS NOT NULL
        DROP TABLE #CarlTest
    IF OBJECT_ID('TempDB..#JBMTest','U') IS NOT NULL
        DROP TABLE #JBMTest

    --===== Create the temp tables for both tests because a stored procedure won't allow
         -- the temp table to be exposed otherwise. These are NOT included in the execution time
         -- even though they should be.
    CREATE TABLE #CarlTest (Element VARCHAR(8000))
    CREATE TABLE #JBMTest (Element VARCHAR(8000))

    --===== Declare and set a variable to control the number of elements for the test
    DECLARE @Elements INT
    SET @Elements = 1000

    --===== Declare the dynamic SQL variables
    DECLARE @SQLSelects VARCHAR(MAX) --For Carl's method
    DECLARE @SQLCSV     VARCHAR(MAX) --For Jeff's method

    --=================================================================================================
    -- This section contains the setup and test for Carl's code
    --=================================================================================================
    --===== Populate the variable with a given number of SELECT/UNION ALLs
         -- for Carl's method. This is NOT included in the execution time.
         -- Basically, this would be the passed parameter.
    SELECT TOP (@Elements)
           @SQLSelects = COALESCE(@SQLSelects + ' UNION ALL ','')
                       + 'SELECT ''ELEMENT'
                       + CAST(ROW_NUMBER() OVER (ORDER BY sc1.ID) AS VARCHAR(10))
                       + ''''
      FROM Master.sys.SysColumns sc1 WITH (NOLOCK)
     CROSS JOIN Master.sys.SysColumns sc2 WITH (NOLOCK)

    --===== Prepend the dynamic SQL with the required INSERT statement
    SELECT @SQLSelects = 'INSERT INTO #CarlTest (Element) ' + @SQLSelects

    --===== Can't print the whole dynamic SQL, so let's show the length of it
    SELECT LEN(@SQLSelects)

    --===== Turn on various statistics and run Carl's method.
         -- This IS the ONLY part that is timed (right after the printed header)
    PRINT REPLICATE('=',100)
    PRINT '========== Carl''s Method =========='
    PRINT REPLICATE('=',100)
    SET STATISTICS IO ON
    SET STATISTICS TIME ON
    EXEC (@SQLSelects)
    SET STATISTICS TIME OFF
    SET STATISTICS IO OFF

    --===== Verify the content of Carl's table manually
         -- Again, not included in any timing
    SELECT * FROM #CarlTest

    --=================================================================================================
    -- This section contains the setup and test for Jeff's code
    --=================================================================================================
    --===== Populate the variable with a given number of CSV elements
         -- for Jeff's method. This is NOT included in the execution time.
         -- Basically, this would be the passed parameter.
    SELECT TOP (@Elements)
           @SQLCSV = COALESCE(@SQLCSV + ',','')
                   + 'ELEMENT'
                   + CAST(ROW_NUMBER() OVER (ORDER BY sc1.ID) AS VARCHAR(10))
      FROM Master.sys.SysColumns sc1 WITH (NOLOCK)
     CROSS JOIN Master.sys.SysColumns sc2 WITH (NOLOCK)

    --===== Can't print the whole parameter SQL, so let's show the length of it
    SELECT LEN(@SQLCSV)

    --===== Turn on various statistics and run Jeff's method.
         -- This IS the ONLY part that is timed (right after the printed header)
    PRINT REPLICATE('=',100)
    PRINT '========== Jeff''s Method =========='
    PRINT REPLICATE('=',100)
    SET STATISTICS IO ON
    SET STATISTICS TIME ON
    ;WITH
    cteTally AS
    (
     SELECT TOP (LEN(@SQLCSV)+1)
            ROW_NUMBER() OVER (ORDER BY sc1.ID) AS N
       FROM Master.sys.SysColumns sc1
      CROSS JOIN Master.sys.SysColumns sc2
    )
    INSERT INTO #JBMTest (Element)
    SELECT SUBSTRING(','+@SQLCSV, t.N+1, CHARINDEX(',', @SQLCSV+',', t.N)-t.N)
      FROM cteTally t
     WHERE SUBSTRING(','+@SQLCSV, t.N, 1) = ','
    SET STATISTICS TIME OFF
    SET STATISTICS IO OFF

    --===== Verify the content of Jeff's table manually
         -- Again, not included in any timing
    SELECT * FROM #JBMTest

    PRINT REPLICATE('=',100)
    PRINT '========== End of tests =========='
    PRINT REPLICATE('=',100)

    ... and here are the results...

    ====================================================================================================
    ========== Carl's Method ==========
    ====================================================================================================
    SQL Server parse and compile time:
       CPU time = 0 ms, elapsed time = 1 ms.
    Table '#CarlTest___________________________________________________________________________________________________________00000000006E'. Scan count 0, logical reads 1002, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    SQL Server Execution Times:
       CPU time = 15 ms, elapsed time = 11 ms.

    SQL Server Execution Times:
       CPU time = 4609 ms, elapsed time = 4703 ms.

    ====================================================================================================
    ========== Jeff's Method ==========
    ====================================================================================================
    Table '#JBMTest____________________________________________________________________________________________________________00000000006F'. Scan count 0, logical reads 1002, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'syscolrdb'. Scan count 2, logical reads 95, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'syscolpars'. Scan count 2, logical reads 9, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    SQL Server Execution Times:
       CPU time = 94 ms, elapsed time = 130 ms.

    ====================================================================================================
    ========== End of tests ==========
    ====================================================================================================

    Yes, I understand the need for "hitting" the code simultaneously... I'm setting up to do that now. Of course, I'll post that, as well. :w00t:

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Steve Jones - Editor (8/18/2008)


    Or write up a short article. It's always good to see how someone tests and their results.

    Heh... but since he challenged my method, I get the first couple of cracks at him. 😛

    --Jeff Moden



  • Actually, I've gotta be up in 5 hours... I'll finish this tomorrow evening. Heh... don't change that channel! 😛

    --Jeff Moden



  • Jeff, regarding the benchmark method you are using, I found a curious situation.

    If this SQL is run, statistics time reports a duration and CPU of zero:

    set statistics time on
    insert into #SalesOrder
    SELECT 880, 12, 2, 32, 0 union all
    .... 72 union all statements ...

    When the same SQL is put into a varchar(max) variable, the run time and CPU time are 78 ms.

    SET @SQLCmd =
    'insert into #SalesOrder
    SELECT 880, 12, 2, 32, 0 union all
    .... 72 union all statements ...
    '
    exec (@SQLCmd)

    SQL = Scarcely Qualifies as a Language

  • Carl Federl (8/19/2008)


    Jeff, regarding the benchmark method you are using, I found a curious situation.

    If this SQL is run, statistics time reports a duration and CPU of zero:

    set statistics time on

    insert into #SalesOrder

    SELECT 880, 12, 2, 32, 0 union all

    .... 72 union all statements ...

    When the same SQL is put into a varchar(max) variable, the run time and cpu time are 78 ms.

    SET @SQLCmd =

    'insert into #SalesOrder

    SELECT 880, 12, 2, 32, 0 union all

    .... 72 union all statements ...

    '

    exec (@SQLCmd )

    I think that you are effectively recording the compile time of the second one (the first one is compiled before the timer starts).
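    One quick way to see this effect directly (a sketch, not from the attached scripts): SET STATISTICS TIME also reports a separate "parse and compile time" line, and EXEC of a string forces that compile to happen inside the timed window.

    ```sql
    SET STATISTICS TIME ON;

    -- The batch text was compiled before it started running,
    -- so mostly execution time shows up here.
    SELECT 1 UNION ALL SELECT 2;

    -- A string passed to EXEC is parsed and compiled at run time,
    -- so its "parse and compile time" is charged to this statement.
    EXEC ('SELECT 1 UNION ALL SELECT 2');

    SET STATISTICS TIME OFF;
    ```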

    -- RBarryYoung, (302)375-0451 blog: MovingSQL.com, Twitter: @RBarryYoung
    Proactive Performance Solutions, Inc.
    "Performance is our middle name."

  • Sorry for the latency. Really busy for a few days.

    In my test, I didn't actually transmit any data with XML. I just used an XML data type to parse the delimited list.

    If the UDF is a problem, the parser can be moved right into the proc where it will be used. It will be even faster if you know it will always be getting a comma-delimited list of integers, as opposed to the UDF, which can take a lot of different variations and has to accommodate them.
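    A rough sketch of what inlining the parser might look like, assuming a comma-delimited list of integers (the table, column, and procedure names here are made up for illustration, not Gus's actual code):

    ```sql
    create proc dbo.GetOrdersInline
        (@IDList varchar(max))
    as
    set nocount on;

    declare @xml xml;
    -- Build the <i>...</i> nodes directly in the proc instead of calling a UDF.
    select @xml = '<i>' + replace(@IDList, ',', '</i><i>') + '</i>';

    select o.*
    from dbo.SalesOrder o
    where o.OrderID in
        (select x.i.value('.', 'int')
         from @xml.nodes('//i') x(i));
    go
    ```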

    I don't have time for a speed test right now, but I'll see if I can get some time in the next day or two.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC

  • Carl Federl (8/19/2008)


    Jeff, regarding the benchmark method you are using, I found a curious situation.

    If this SQL is run, statistics time reports a duration and CPU of zero:

    set statistics time on

    insert into #SalesOrder

    SELECT 880, 12, 2, 32, 0 union all

    .... 72 union all statements ...

    When the same SQL is put into a varchar(max) variable, the run time and cpu time are 78 ms.

    SET @SQLCmd =

    'insert into #SalesOrder

    SELECT 880, 12, 2, 32, 0 union all

    .... 72 union all statements ...

    '

    exec (@SQLCmd )

    Yep... I agree, Carl... but I've gotta ask... how would you pass your "parameter" to the server otherwise? And I apologize if it's in your code... I just didn't have the time to review it all...

    --Jeff Moden



  • In reply to Jeff Moden's question: under the "union all" solution there are no "parameters", just SQL statements that are sent to the server as a normal command batch.

    SQL = Scarcely Qualifies as a Language

  • Just a quick comment on the original post: ApexSQL is developing a debugger for 2005/2008 that blows away the one MS provides with 2008. One of its better features is a code profiling system that returns execution counts and durations per line of code. It also allows you to see temp table contents during a debugging session. They have had a few issues with keeping the project moving forward, but those are behind them now and I am hopeful for a release within 6 weeks.

    Disclaimer: I have a close relationship with Apex, including testing and some help with product improvements.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service
