Retrieving Bulk Data

  • Hi Martin,

    The main reason you get an even worse result when executing the loop is that the results are being displayed in the results pane of Query Analyzer (QA).

    If you rewrite the test loop so that the results are not displayed, you will get much better timing; a quick sketch of one way to do that is below.
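
    A minimal sketch of what I mean, using GenericReport and its UID key (the variable names are mine, and @iDummy is just a throwaway target; assigning a column to a variable keeps QA from rendering a result set for each row):

    DECLARE @iUID INT, @iMaxUID INT, @iDummy INT
    SELECT @iUID = MIN(UID), @iMaxUID = MAX(UID) FROM GenericReport
    WHILE @iUID <= @iMaxUID
    BEGIN
     -- The assignment produces no result set, so nothing is drawn in the results pane
     SELECT @iDummy = UID FROM GenericReport WHERE UID = @iUID
     SET @iUID = @iUID + 1
    END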

    For example: on our server, retrieving 60,000 rows from a table with an INT clustered primary key using your test loop takes:

    1. 35 seconds when nothing is displayed

    2. 8 minutes and counting... when each row is displayed.

    (This table has 1,300,000 rows and 67 columns.)

    I modified your test loop so that it retrieves all the rows one by one and inserts them into another table. You may want to try it.

    -- Create an empty duplicate table

    SELECT * INTO GenericReport_dup FROM GenericReport WHERE 1 = 0

    -- NOTE! If UID is an IDENTITY column, make it non-IDENTITY in GenericReport_dup now
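
    -- One possible way to do that, sketched only (Col1, Col2 are placeholders for your other columns):
    -- build the duplicate with the identity column wrapped in an expression instead of the
    -- SELECT * INTO above, since SELECT INTO does not copy the IDENTITY property for an expression.
    -- SELECT CAST(UID AS INT) AS UID, Col1, Col2
    -- INTO GenericReport_dup
    -- FROM GenericReport WHERE 1 = 0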

    CREATE PROCEDURE dbo.VMR_BenchMark2 AS

    SET NOCOUNT ON -- Always do this in procs

    DECLARE @iMaxUID INT
    DECLARE @iMinUID INT

    PRINT 'Start Time: ' + CAST(GETDATE() AS NVARCHAR(20))

    SELECT @iMaxUID = MAX(UID) FROM GenericReport

    SELECT @iMinUID = MIN(UID) FROM GenericReport

    WHILE @iMinUID <= @iMaxUID

    BEGIN

     INSERT GenericReport_dup

     SELECT * FROM GenericReport

     WHERE UID = @iMinUID

     SET @iMinUID = @iMinUID + 1

    END

    PRINT 'End Time: ' + CAST(GETDATE() AS NVARCHAR(20))

    GO

    -- Do the stuff

    EXEC dbo.VMR_BenchMark2

    SELECT COUNT(*) FROM GenericReport_dup

    --TRUNCATE TABLE GenericReport_dup

    --DROP TABLE GenericReport_dup

    GL,

    /rockmoose


    You must unlearn what You have learnt
