Passing Table-Valued Parameters to a stored procedure

  • We are using TVPs in a call to a stored procedure. The proc takes three arguments, all of which are TVPs. This functionality does not seem to scale very well.

    No matter how many threads are calling the data access code that calls the proc, it always performs the same.

    We removed all functionality in the proc but this made almost no difference.

    It seems that most of the time is spent passing the TVPs from the .NET data access code to the stored procedure.

    Could it be that there are some locks involved?

    Are TVPs not recommended if you call the stored proc at a high frequency?

    Do we have to send batches instead to achieve acceptable performance?

  • No one has an answer?

  • Try providing some detail?

    http://www.sommarskog.se/share_data.html#tableparam

  • OK, below is the data access code and the proc definition, followed by the database trace.

    After tracing, I have seen that the overhead seems to lie between the ExecuteNonQuery call in the data access code and the point where the database has generated the SQL.

    Data access code:

    internal static void StoreMessage(dsC200MessageDeliverables dataSetC200MessageDeliverables)
    {
        // Since it is not possible to dispatch typed DataTable objects to the database -
        // SQL Server 2008 does not understand typed DataSet:s - one needs to transfer
        // the contents of such typed DataTable:s into base DataTable:s.
        // This can be done by merging the typed objects into base objects.
        //
        DataTable vehicleOperationalData = new DataTable();
        DataTable positionData = new DataTable();
        DataTable driverSupportEvaluationData = new DataTable();

        vehicleOperationalData.Locale = CultureInfo.InvariantCulture;
        positionData.Locale = CultureInfo.InvariantCulture;
        driverSupportEvaluationData.Locale = CultureInfo.InvariantCulture;

        vehicleOperationalData.Merge(dataSetC200MessageDeliverables.VehicleOperationalAttributes);
        positionData.Merge(dataSetC200MessageDeliverables.PositionAttributes);
        driverSupportEvaluationData.Merge(dataSetC200MessageDeliverables.DriverSupportEvaluationAttributes);

        // Create an abstract database object for the Fleetmanagement database.
        //
        Database database = DatabaseFactory.CreateDatabase(Scpv2Constants.ConnectionStringNames.FleetManagement);

        // Use the abstract database object to create a SqlConnection object and
        // attempt to access the database and execute the stored procedure.
        //
        using (SqlConnection sqlConnection = (SqlConnection)database.CreateConnection())
        {
            // Define a command object for calling the stored procedure.
            // Note: The Parameters collection of the SqlCommand object automatically assigns the
            // DbType property of each parameter to Object, and the SqlDbType property to Structured.
            //
            SqlCommand sqlCommand = sqlConnection.CreateCommand();
            sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
            sqlCommand.CommandText = "VehicleData.InsertCurrentStatusIntoDataCache";
            sqlCommand.Parameters.AddWithValue("@VehicleOperationalData", vehicleOperationalData);
            sqlCommand.Parameters.AddWithValue("@PositionData", positionData);
            sqlCommand.Parameters.AddWithValue("@DriverSupportEvaluationData", driverSupportEvaluationData);

            // Open the connection, execute the command and close the connection.
            //
            if (sqlCommand.Connection.State != ConnectionState.Open)
            {
                sqlCommand.Connection.Open();
            }

            sqlCommand.ExecuteNonQuery();

            if (sqlCommand.Connection.State == ConnectionState.Open)
            {
                sqlCommand.Connection.Close();
            }
        }
    }

    The stored proc:

    The @VehicleOperationalData contains a single row with about 50 properties.

    The @PositionData contains up to 40 rows of position information.

    The @DriverSupportEvaluationData contains a single row with 5 properties.

    CREATE PROCEDURE [VehicleData].[InsertCurrentStatusIntoDataCache]
    (
        @VehicleOperationalData VehicleData.UDTVehicleOperationalDataCache READONLY,
        @PositionData VehicleData.UDTPositionDataCache READONLY,
        @DriverSupportEvaluationData VehicleData.UDTDriverSupportEvaluationDataCache READONLY
    )

    This SQL shows up in the profiler, and it takes no time to execute:

    declare @p1 VehicleData.UDTVehicleOperationalDataCache
    insert into @p1 values(132,6,1,9,'2010-04-01 06:08:46',0,1,99,1287576411,2042004201,200864016,628688412,182268930,1196619099,701209549,1977639025,1903413413,919902981,1154473410,1943214601,1968290260,332574586,1549013664,450589475,1567814154,870811069,176,232,4,148,2056159412,2,0,1,1,0,1,1,0,0,0,1,0,0,0,0,0,N'+68000874',N'F 1000000461312')

    declare @p2 VehicleData.UDTPositionDataCache
    insert into @p2 values(1,-2.487744,51.669033,155,168,'2010-04-01 06:08:46',0)

    declare @p3 VehicleData.UDTDriverSupportEvaluationDataCache
    insert into @p3 values(-1,-1,-1,-1)

    exec VehicleData.InsertCurrentStatusIntoDataCache @VehicleOperationalData=@p1,@PositionData=@p2,@DriverSupportEvaluationData=@p3
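
    As a side note on the data access code above: AddWithValue leaves the parameter typing up to ADO.NET. Below is a minimal sketch of declaring the TVP parameters explicitly instead, using the table type names from the proc definition (the helper class and method names are invented for illustration and are not part of our code):

    using System.Data;
    using System.Data.SqlClient;

    internal static class TvpParameterHelper
    {
        // Sketch only: add a TVP parameter with an explicit SqlDbType.Structured
        // and user-defined table type name, instead of relying on AddWithValue
        // to infer them from the DataTable value.
        internal static void AddTvpParameter(
            SqlCommand command, string parameterName, string tableTypeName, DataTable value)
        {
            SqlParameter parameter = command.Parameters.Add(parameterName, SqlDbType.Structured);
            parameter.TypeName = tableTypeName;   // e.g. "VehicleData.UDTPositionDataCache"
            parameter.Value = value;
        }
    }

    For example: TvpParameterHelper.AddTvpParameter(sqlCommand, "@PositionData", "VehicleData.UDTPositionDataCache", positionData); - being explicit does not change the amount of data sent, but it takes type inference out of the picture.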

  • If the SQL Server execution time is effectively zero, that's a good hint that the issue lies elsewhere 🙂

    Network latency springs to mind, but yes - consider batching and streaming.

    See Table-Valued Parameters in SQL Server 2008 (ADO.NET).

    That's a great link, with many options to give you some ideas.
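
    For what it is worth, "streaming" in the ADO.NET sense usually means passing an IEnumerable<SqlDataRecord> as the TVP value instead of a DataTable, so rows are written to the wire as they are produced rather than being materialized first. A minimal sketch follows, with an invented Position type and a simplified three-column layout that does not match the real UDTPositionDataCache:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using Microsoft.SqlServer.Server;

    // Hypothetical message type used only for this sketch.
    internal class Position
    {
        public double Longitude { get; set; }
        public double Latitude { get; set; }
        public DateTime RecordedAt { get; set; }
    }

    internal static class PositionTvpStreamer
    {
        // Sketch only: stream rows into a TVP via IEnumerable<SqlDataRecord>
        // instead of building a DataTable up front.
        internal static IEnumerable<SqlDataRecord> ToSqlDataRecords(IEnumerable<Position> positions)
        {
            // Illustrative column list - the real table type has more columns.
            SqlMetaData[] metaData = new[]
            {
                new SqlMetaData("Longitude", SqlDbType.Float),
                new SqlMetaData("Latitude", SqlDbType.Float),
                new SqlMetaData("RecordedAt", SqlDbType.DateTime)
            };

            foreach (Position position in positions)
            {
                SqlDataRecord record = new SqlDataRecord(metaData);
                record.SetDouble(0, position.Longitude);
                record.SetDouble(1, position.Latitude);
                record.SetDateTime(2, position.RecordedAt);
                yield return record;
            }
        }
    }

    The enumerable goes straight into the parameter value (with SqlDbType.Structured and the TypeName set as usual), e.g. positionParam.Value = PositionTvpStreamer.ToSqlDataRecords(positions);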

  • According to the article below, tempdb is not used for TVPs; instead, SQL Server works with them partially in memory and partially on disk.

    In our scenario, if SQL Server works with the TVPs on disk, it is not surprising if TVPs perform worse than an ordinary stored proc without TVPs.

    http://searchsqlserver.techtarget.com/tip/0,289483,sid87_gci1280004,00.html

    Temp tables vs. table variables in SQL Server

    Temp tables are stored physically on disk in tempdb, while table variables are stored partially in memory and partially on disk. This difference means a few things for your code. Since table variables are at least partially in memory, they can be faster than temp tables.

  • Finnlake (4/1/2010)


    According to this article below...

    ...which is complete nonsense. 😛

    Sunil Agarwal is a Program Manager in the SQL Server Storage Engine Group at Microsoft; read his views on the SQL Server Storage Engine Team Blog here:

    http://blogs.msdn.com/sqlserverstorageengine/archive/2008/03/30/sql-server-table-variable-vs-local-temporary-table.aspx

    You might like to read the related entries too - there's a lot of very good stuff there.

    SQL Server is not the cause of your performance issues - it is the design.

    Paul

  • SQL Server is not the cause of your performance issues - it is the design.

    What do you mean by that? I notice an overhead when using three table-valued parameters in a call to a stored proc.

    There is not much that can be redesigned here, is there?

  • Finnlake (4/1/2010)


    SQL Server is not the cause of your performance issues - it is the design.

    What do you mean by that? I notice an overhead when using three table-valued parameters in a call to a stored proc.

    There is not much that can be redesigned here, is there?

    I mean the limit on performance is down to the overall application design - not SQL Server's handling of TVPs.

    Batch the requests, use streaming, or otherwise tune your application/network/whatever.

    It is not SQL Server that is failing to 'scale well' here.

  • In our case we cannot batch requests, i.e. messages, since we would then have to keep the messages in memory, and if the power goes, all messages are lost.

    We need to insert them into the database as fast as possible. And if TVPs are slower than calling an ordinary stored proc, then we will not use them.

    We can now insert ~100 messages per second using TVPs, and if we can insert 200 or 300 messages per second by not using TVPs, then we will not use them.

  • The whole point of TVPs is to allow you to pass a bunch of data at once, to minimize round-trips and improve efficiency.

    You might like to explore programming techniques that allow you to batch requests in a safe manner.

    Doing things one row at a time is never going to scale well.

    Anyway, good luck with your project.
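
    To make "batch requests in a safe manner" a little more concrete, here is a minimal sketch of one application-side pattern: buffer incoming messages and flush them in groups, so each TVP call carries many rows instead of one. The class and member names are invented for illustration, and it does hold a small window of messages in memory, which may or may not be acceptable given the durability concern raised earlier in the thread:

    using System;
    using System.Collections.Generic;

    // Sketch only: accumulate incoming messages and hand them off in small batches,
    // so each database round-trip carries many rows in the TVPs instead of one.
    internal class MessageBatcher<TMessage>
    {
        private readonly List<TMessage> _buffer = new List<TMessage>();
        private readonly object _syncRoot = new object();
        private readonly int _batchSize;
        private readonly Action<IList<TMessage>> _flushBatch;   // e.g. a TVP-based insert of the whole batch

        internal MessageBatcher(int batchSize, Action<IList<TMessage>> flushBatch)
        {
            _batchSize = batchSize;
            _flushBatch = flushBatch;
        }

        internal void Add(TMessage message)
        {
            List<TMessage> batchToFlush = null;

            lock (_syncRoot)
            {
                _buffer.Add(message);
                if (_buffer.Count >= _batchSize)
                {
                    batchToFlush = new List<TMessage>(_buffer);
                    _buffer.Clear();
                }
            }

            // Flush outside the lock so other producers are not blocked by the database call.
            if (batchToFlush != null)
            {
                _flushBatch(batchToFlush);
            }
        }
    }

    A real version would also flush on a timer so a partially filled batch never waits indefinitely, and would decide what happens if the flush delegate throws; both are left out to keep the sketch short.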

  • Paul White NZ (4/1/2010)


    The whole point of TVPs is to allow you to pass a bunch of data at once, to minimize round-trips and improve efficiency.

    I think that's why the original poster was trying to use TVPs, and why he was surprised they appeared to be so slow...

    I think he knows that, too, because that's why he posted for help...

    Paul White NZ (4/1/2010)


    You might like to explore programming techniques that allow you to batch requests in a safe manner.

    In theory, TVPs were one.

    I suppose he could write to a text file and do a bulk insert from it. Of course, that's processing the data 4 times instead of 2, and forcing it to be at disk instead of memory speeds...

    Any other ideas?

  • david_wendelken (4/2/2010)


    I suppose he could write to a text file and do a bulk insert from it. Of course, that's processing the data 4 times instead of 2, and forcing it to be at disk instead of memory speeds...Any other ideas?

    Any method that reliably batches up the requests at the application/mid-tier level and then uses something like TVPs or Service Broker...there are lots of options.

  • What is the "shape" of the TVP's being passed? I mean, how many rows and columns per TVP and what kind (datatype) of data? Also, (mostly curiosity on my part) what is the data being passed in each TVP? I just went through this with a client... they were passing about a quarter million rows of data... it turns out that all they really needed to do was pass about 6 INTs and two dates and then do a cross join between two tables to produce the necessary rows which only took tenths of seconds compared to what they were doing.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Hi!

    Thanks very much for your answers. It seems like TVPs are not the ideal technique to use in our case, then.

    In our system we receive a message with a lot of parameters and connected to this data is a set of positions.

    We thought the set of positions would be ideal for a TVP, instead of using some kind of "hand-crafted" XML handling.

    Also, normalizing the data using TVPs made the code and stored procedure look a lot nicer.

    So we will probably create a procedure taking all the parameters and pass the set of positions to the stored proc as XML.
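
    For completeness, here is a minimal sketch of what the XML route might look like on the .NET side, assuming the positions are still held in a DataTable. The element and attribute names, the column names, and the @PositionXml parameter are all illustrative; the stored procedure would shred the parameter with the xml type's nodes()/value() methods:

    using System.Data;
    using System.Data.SqlClient;
    using System.Text;

    internal static class PositionXmlBuilder
    {
        // Sketch only: serialize the position rows to a small XML document and
        // pass it as a single xml parameter instead of a TVP.
        internal static string BuildPositionXml(DataTable positionData)
        {
            StringBuilder xml = new StringBuilder("<Positions>");
            foreach (DataRow row in positionData.Rows)
            {
                // Illustrative only - assumes the values need no XML escaping or special formatting.
                xml.AppendFormat(
                    "<Position Longitude=\"{0}\" Latitude=\"{1}\" RecordedAt=\"{2}\" />",
                    row["Longitude"], row["Latitude"], row["RecordedAt"]);
            }
            xml.Append("</Positions>");
            return xml.ToString();
        }

        internal static void AddPositionXmlParameter(SqlCommand command, DataTable positionData)
        {
            SqlParameter parameter = command.Parameters.Add("@PositionXml", SqlDbType.Xml);
            parameter.Value = BuildPositionXml(positionData);
        }
    }

    Whether this ends up faster than the TVPs is worth measuring rather than assuming, since the proc still has to shred the XML back into rows.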
