Higher reads when using ADO.NET than when using Query Analyzer

  • I was doing some testing and noticed that when I execute a stored procedure from my ADO.NET code (using the SqlCommand class in the System.Data.SqlClient namespace), the number of reads on a relatively simple query is significantly higher than when executing it from Query Analyzer. In fact, I have not found a case where the reads from ADO.NET are lower than the reads from QA; sometimes they are the same, but QA always has equal or fewer reads. Is there any reason why this is, and is there anything I can do to get ADO.NET to perform like QA?

    --Buddy

  • You might want to look into the execution plan that is used.  I've seen similar differences in performance when running the same stored procedure from various sources; it was using two or more execution plans for the same procedure.  UPDATE STATISTICS or sp_updatestats might resolve this for you by refreshing the statistics, which invalidates the cached plans and forces a recompile.  You can also use the Manage Statistics tool in Query Analyzer.  See the sketch below.
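
    A minimal sketch of that approach (the table and procedure names here are hypothetical placeholders; substitute your own):

            -- Refresh statistics on the table the procedure touches
            UPDATE STATISTICS dbo.myTable

            -- Or refresh statistics for every table in the database
            EXEC sp_updatestats

            -- Mark the procedure for recompilation so it picks up a fresh plan
            EXEC sp_recompile 'dbo.myProc'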

  • There are many ways to retrieve data in ADO.NET, and there are performance trade-offs with each.  The data provider, object properties, and the way you open and close connections also come into play.  In other words, we need to see some code.

  • It looks something like this:

                ' Requires Imports System.Data.SqlClient; cn is an open SqlConnection
                Dim sqlCmd As New SqlCommand()
                Dim myReader As SqlDataReader

                With sqlCmd
                    .Connection = cn
                    .CommandType = CommandType.StoredProcedure
                    .CommandText = "dbo.myProc"
                    ' Pass the key value as a typed parameter
                    .Parameters.Add(New SqlParameter("@myparam", SqlDbType.Int, 4)).Value = myparam
                    myReader = .ExecuteReader()
                End With

    The stored procedure is excruciatingly simple:

            create proc dbo.myProc
                @myparam int
            AS
            SET NOCOUNT ON
            SELECT * FROM dbo.myTable WHERE tblPKField = @myparam

    Profiler shows that this proc takes 4 reads when executed from QA and 13 when executed from .NET.  I tried executing it with a CommandType of Text, but I got the same 13 reads.  The table in the proc has only 4 records in it right now (average record size is about 125 bytes), and there is only one index, clustered on the primary key.
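
    For reference, the QA side can also be cross-checked with SET STATISTICS IO, which reports logical reads per statement (a sketch; the parameter value below is a placeholder):

            SET STATISTICS IO ON
            EXEC dbo.myProc @myparam = 1   -- placeholder key value
            SET STATISTICS IO OFF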

  • I have seen similar behaviour caused by different locking strategies. It's sometimes not clear to me why SQL Server uses row-level, page-level, or table-level locking, but sometimes a query from connection A runs faster (i.e. fewer locks, such as page locks) than on connection B (i.e. more locks, such as row-level locks), even though it's the same SQL and even the same execution plan.

    There is some "intelligence" built into SQL Server about when to choose which locking mechanism. Memory usage is one aspect SQL Server looks at; there is more to it than that, but I don't know the details. However, you can check with SQL Profiler how many locks are used by your query in ADO and in QA to see if locks are the issue. A rough sketch of comparing them directly is below.
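
    A rough way to compare the locks each connection takes, as a sketch using the standard sp_lock procedure (the spid values are placeholders; read the real ones from sp_who):

            -- Find the spids of the ADO.NET and QA connections
            EXEC sp_who

            -- Dump the locks held by each connection (52 and 53 are placeholder spids)
            EXEC sp_lock 52
            EXEC sp_lock 53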

    Kay

  • I ran the example given (I added a bigger table to generate some reads, but it is otherwise almost exact) and I could not reproduce the behavior.

    .Net SqlClient Data Provider
    RPC:Completed
    exec dbo.myProc @myparam = 'ALL'
    358 reads

    SQL Query Analyzer
    SQL:BatchCompleted
    exec dbo.myProc @myparam = 'ALL'
    358 reads

    Kay brought up a good tip as well, but it's hard to say what the actual issue is in your environment.

    -Max
