Memory used by stored procedure call

  • Is there a way to determine how much memory will be used by a procedure call? The execution plan deals with I/O and CPU, but what about memory?

    I would like to compare two procedures for performance; time-wise they perform the same, so the difference could be in how effectively they use server memory.

    For example (and this is the simplest example I could come up with), consider the following two pieces of code:

    IF EXISTS (SELECT * FROM Orders WHERE OrderID = 1)
    BEGIN
        -- DO STUFF
    END

    IF EXISTS (SELECT OrderID FROM Orders WHERE OrderID = 1)
    BEGIN
        -- DO STUFF
    END

    Is the second one going to use less memory than the first one because it lists only one field?
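    A quick way to check, assuming the Orders/OrderID example above, is to compare the plans the two forms actually get in Query Analyzer (a sketch; in an EXISTS test the optimizer generally ignores the select list, so both usually come back identical):

    ```sql
    -- Show the estimated plan for each form instead of executing it.
    SET SHOWPLAN_TEXT ON
    GO
    IF EXISTS (SELECT * FROM Orders WHERE OrderID = 1)
        PRINT 'found'
    GO
    IF EXISTS (SELECT OrderID FROM Orders WHERE OrderID = 1)
        PRINT 'found'
    GO
    SET SHOWPLAN_TEXT OFF
    GO
    ```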

    ---------------------------------------------
    [font="Verdana"]Nothing is impossible.
    It is just a matter of time and money.[/font]

  • Hi,

    You can use SQL Profiler for that purpose. You can define a trace template, and in the Events tab you can choose Stored Procedures.

    In the Data Columns tab you can choose CPU and many others.

    I hope that helps.

     


    Daniela

  • Daniela, thanks for your reply, but I looked at SQL Profiler before and there is nothing about memory.

    ---------------------------------------------
    [font="Verdana"]Nothing is impossible.
    It is just a matter of time and money.[/font]

  • select bucketid, cacheobjtype, objtype, objid, name, refcounts,
           usecounts, setopts, sqlbytes, sql
    from master.dbo.syscacheobjects with (nolock)
    left outer join xenon.dbo.sysobjects with (nolock)
        on objid = id
    where dbid = xxxx
    order by objid

    Put the dbid of your database in place of xxxx.

    Adjust the query to suit; you're interested in sqlbytes.

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

  • Thanks for the reply, but this is not what I need.

    Maybe I did not describe properly what I am looking for.

    The sqlbytes column in syscacheobjects shows how much memory SQL Server allocated for the 'executable' code. I would like to find out how much memory SQL Server used during runtime (the 'allocated' storage memory).

    Example:

    SELECT * FROM Table1 ORDER BY field1.

    If field1 is not indexed, SQL Server has to load the whole Table1 into memory and then perform the sort (or, if it is smart, it creates an index on the fly, or loads only the primary key field(s) and field1; I don't really know how it does this). I would like to know how much memory SQL Server had to use in order to execute that code.

    One place that comes close to what I am looking for is the memusage column in sysprocesses, but that one shows cumulative numbers, in pages, so it is very hard to find out how much memory a specific call used.
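    For what it's worth, that column can at least be sampled before and after a call; a rough sketch (the procedure name is hypothetical, and the delta includes anything else the session did in between):

    ```sql
    -- memusage is cumulative per session and counted in pages (8 KB each),
    -- so this only gives a coarse delta, not a per-statement figure.
    DECLARE @before int, @after int

    SELECT @before = memusage FROM master.dbo.sysprocesses WHERE spid = @@SPID

    EXEC dbo.MyProcedure  -- the call under test (hypothetical name)

    SELECT @after = memusage FROM master.dbo.sysprocesses WHERE spid = @@SPID

    SELECT (@after - @before) * 8 AS approx_delta_kb
    ```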

    ---------------------------------------------
    [font="Verdana"]Nothing is impossible.
    It is just a matter of time and money.[/font]

  • I don't really understand what you are trying to achieve or why, but if you take the I/O count and multiply by 8 KB, that's (roughly) how much data cache you use. The data will always be put into cache. I'd use Profiler to count the I/O, as QA doesn't always return all I/O counts.

    Read-ahead is in 64 KB blocks, so there's always a chance more than you ask for is read into cache. Each index will also put pages into cache, so if Q1 shows 10 I/O in Profiler and Q2 shows 5 I/O, then Q2 will technically have used 50% of the data cache Q1 did. There are some good webcasts on this sort of thing by Kalen Delaney and Kimberly Tripp, plus Microsoft TechNet; look under SQL 2005, there are some good white papers. The Administrator's Companion from MS Press also deals with some of this.
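    The reads-times-8 KB arithmetic can be tried directly in Query Analyzer (a sketch using the Orders example from earlier in the thread):

    ```sql
    -- Report per-statement I/O in the Messages tab.
    SET STATISTICS IO ON

    SELECT * FROM Orders WHERE OrderID = 1

    SET STATISTICS IO OFF

    -- If the output shows, say, "logical reads 10", the statement touched
    -- roughly 10 * 8 KB = 80 KB worth of data-cache pages.
    ```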

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

  • OK, let me ask a set of questions that lead to where my thinking is going...

    Let's assume I have a stored procedure that reads some data from a physical table into a @Tmp table (a table variable).

    1. The I/O will show the amount of data read from disk and stored into @Tmp, and will equal the amount of data placed into cache. Correct?

    Within that stored procedure I create another table variable, @Tmp2, and reload some of the data from @Tmp into @Tmp2.

    2. There should be no I/O, since the data I am manipulating is already in memory; what I am doing is allocating another block of memory to hold this data in @Tmp2. Correct?

    Then I declare several CHAR(8000) variables.

    3. That should allocate another block of memory. Right?

    Do you see where I am going with this? There are different kinds of memory allocations SQL Server has to make in order to perform its duties. Some of it is the compiled code, some of it is the cache SQL Server uses for data normally stored on disk, and the third part is the memory used while executing the code, as temporary storage for variables, derived tables, etc.
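    The three allocations described above, as a compact sketch (table and column names are made up):

    ```sql
    -- 1. Read from a physical table into a table variable:
    --    disk I/O, and the pages read also land in the data cache.
    DECLARE @Tmp TABLE (OrderID int, Total money)
    INSERT INTO @Tmp SELECT OrderID, Total FROM Orders

    -- 2. Copy part of @Tmp into a second table variable: no new disk I/O,
    --    but more working storage for the copy.
    DECLARE @Tmp2 TABLE (OrderID int)
    INSERT INTO @Tmp2 SELECT OrderID FROM @Tmp WHERE Total > 100

    -- 3. Large local variables: more working storage again.
    DECLARE @buf1 char(8000), @buf2 char(8000)
    ```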

    ---------------------------------------------
    [font="Verdana"]Nothing is impossible.
    It is just a matter of time and money.[/font]

  • You need Ken Henderson's Guru's Guide to SQL Server internals.

    ISBN 0-201-70047-6

    By the way, don't always assume table variables are created in memory; if SQL Server thinks it better, it will write them to tempdb. There's a TechNet/KB article which states that both table variables and #temp tables may or may not be created in memory.

    For a 32-bit OS all that extra memory only caches data; you need 64-bit to improve the other caches.
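    One rough way to see temporary objects materialise in tempdb: #temp tables show up in tempdb's catalog by name, while table variables do not, which is part of why their storage is hard to observe (a sketch):

    ```sql
    -- Create a #temp table, then look for it in tempdb's system catalog.
    CREATE TABLE #probe (i int)

    SELECT name, crdate
    FROM tempdb.dbo.sysobjects
    WHERE name LIKE '#probe%'

    DROP TABLE #probe
    ```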

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

  • Colin,

    Thanks a lot.

    I just hoped I had missed some built-in feature that could get this info easily...

     

    ---------------------------------------------
    [font="Verdana"]Nothing is impossible.
    It is just a matter of time and money.[/font]
