Tally OH! An Improved SQL 8K “CSV Splitter” Function

  • Paul White (5/14/2014)


    Is everyone running Jeff's original/updated test rig from the article to get these results?

    If so, you might not be testing what you think you are. There is a weakness in the test script that can result in the Csv8K table containing duplicates (i.e. every row is the same). This is related to hiding the NEWID call in the view. In (at least) SQL Server 2012 and 2014, this NEWID value may be cached, resulting in every row being the same.

    It is obviously not a very good test to simply split the string once, cache the results in a table spool, and replay that for each duplicate row from the Csv8K table. This particularly (and unfairly) benefits the non-CLR splitters by the way.

    As a quick hack fix, you can use undocumented trace flag 8690 to prevent caching the NEWID value in this part of the script:

SELECT @SQL = ISNULL(@SQL,'')+
'
IF OBJECT_ID(''dbo.Csv8K'',''U'') IS NOT NULL DROP TABLE dbo.Csv8K;
SELECT *
INTO dbo.Csv8K
FROM dbo.CreateCsv8K
('+CAST(NumberOfRows AS VARCHAR(10))+', '
+CAST(NumberOfElements AS VARCHAR(10))+', '
+CAST(MinElementLength AS VARCHAR(10))+', '
+CAST(MaxElementLength AS VARCHAR(10))+') OPTION (QUERYTRACEON 8690); --# of Rows, # of Elements, MIN element length, MAX element length
EXEC dbo.TestEachFunction '+CAST(NumberOfRows AS VARCHAR(10))+', '
+CAST(NumberOfElements AS VARCHAR(10))+', '
+CAST(MinElementLength AS VARCHAR(10))+', '
+CAST(MaxElementLength AS VARCHAR(10))+';
'
FROM cteControl;

Nevertheless, you need to be careful to verify that the rows in the Csv8K table are actually different, as intended.
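A quick way to run that check (a sketch, assuming the CSV column name used by the test harness):

SELECT TotalRows    = COUNT(*),
       DistinctRows = COUNT(DISTINCT CSV)
  FROM dbo.Csv8K;
-- If DistinctRows comes back as 1, the NEWID value was cached and every row is identical.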

    One more thing. The CLR code isn't fully compiled to native code on creation. The first call to the function may take a full second or more as just-in-time compilation takes place. This also seems unfair, so I add an initial do-nothing split just after the CLR function creation to the test script. This ensures at least most of the CLR is compiled before the main tests start:

    SELECT * FROM dbo.Split(N'A,B,C,D', N',');

    FWIW a test I just ran with the fixes in place produced:

+-----------------------+---------------+-------------+
| SplitterName          | TotalDuration | AvgDuration |
+-----------------------+---------------+-------------+
| DelimitedSplit8K_LEAD |      19.07600 |    0.389306 |
| Split                 |      12.35800 |    0.252204 |
+-----------------------+---------------+-------------+

    BTW, that's CLR Split vs Usman's latest version of DelimitedSplit8K_LEAD.

    Thank you Paul for this very useful information.

A thought: should we put together a standard data set for this purpose? At least we could eliminate one variable from the equation.

    😎

  • Paul White (5/14/2014)


    Is everyone running Jeff's original/updated test rig from the article to get these results?

    If so, you might not be testing what you think you are. There is a weakness in the test script that can result in the Csv8K table containing duplicates (i.e. every row is the same). This is related to hiding the NEWID call in the view. In (at least) SQL Server 2012 and 2014, this NEWID value may be cached, resulting in every row being the same.

It wasn't the case in my testing, at least 🙂

    It is obviously not a very good test to simply split the string once, cache the results in a table spool, and replay that for each duplicate row from the Csv8K table. This particularly (and unfairly) benefits the non-CLR splitters by the way.

    As a quick hack fix, you can use undocumented trace flag 8690 to prevent caching the NEWID value in this part of the script:

[trace flag 8690 script snipped; quoted in full above]

Nevertheless, you need to be careful to verify that the rows in the Csv8K table are actually different, as intended.

I am not amazed at all at getting this new info from you; I am used to it now :hehe: On a serious note, it is always great to have you live on any thread, as we learn things we might not even think to look into. You are truly an inspiration.

    One more thing. The CLR code isn't fully compiled to native code on creation. The first call to the function may take a full second or more as just-in-time compilation takes place. This also seems unfair, so I add an initial do-nothing split just after the CLR function creation to the test script. This ensures at least most of the CLR is compiled before the main tests start:

    SELECT * FROM dbo.Split(N'A,B,C,D', N',');

    FWIW a test I just ran with the fixes in place produced:

+-----------------------+---------------+-------------+
| SplitterName          | TotalDuration | AvgDuration |
+-----------------------+---------------+-------------+
| DelimitedSplit8K_LEAD |      19.07600 |    0.389306 |
| Split                 |      12.35800 |    0.252204 |
+-----------------------+---------------+-------------+

    BTW, that's CLR Split vs Usman's latest version of DelimitedSplit8K_LEAD.

Well, that kind of performance output we see more often. So sad I don't have SQL 2014 yet :angry:

  • Eirikur Eiriksson (5/14/2014)


A thought: should we put together a standard data set for this purpose? At least we could eliminate one variable from the equation.

I have no objections. See what everyone else thinks. The tricky thing in this forum environment is making sure everyone knows where to find the data set (and even that a new one is being used). It should be hosted somewhere we can all see, of course; I just mean that keeping the link/information visible as time goes on can be challenging in this format.

  • Paul White (5/14/2014)


    Eirikur Eiriksson (5/14/2014)


A thought: should we put together a standard data set for this purpose? At least we could eliminate one variable from the equation.

I have no objections. See what everyone else thinks. The tricky thing in this forum environment is making sure everyone knows where to find the data set (and even that a new one is being used). It should be hosted somewhere we can all see, of course; I just mean that keeping the link/information visible as time goes on can be challenging in this format.

And what about the min/max length and number-of-elements variations..... :hehe:

I always say it all depends upon the requirements one has. But one common data set could mean we would have more faith when comparing the results. 😎

  • BTW, can someone test the physical tally table splitters as well? @paul-2 Can you be so kind to me :unsure:

SELECT TOP 11000
       IDENTITY(INT,1,1) AS N
  INTO dbo.Tally
  FROM sys.all_columns ac1
 CROSS JOIN sys.all_columns ac2;

--===== Add a CLUSTERED Primary Key to maximize performance
ALTER TABLE dbo.Tally
  ADD CONSTRAINT PK_Tally_N
      PRIMARY KEY CLUSTERED (N) WITH FILLFACTOR = 100;
GO
IF OBJECT_ID('dbo.Split8KTally_With_N_PLUS_1','IF') IS NOT NULL
   DROP FUNCTION dbo.Split8KTally_With_N_PLUS_1;
GO
--===== Tally Table (Split8KTally_With_N_PLUS_1 iTVF) ==============================================================
/*===== THIS FUNCTION USES THE RUNTIME CALCULATION "N + 1" INSTEAD OF THE PRE-CALCULATED COLUMN "N1" OF THE TALLY TABLE =====*/
CREATE FUNCTION dbo.Split8KTally_With_N_PLUS_1
       (@pString VARCHAR(8000), @pDelimiter VARCHAR(1))
RETURNS TABLE
AS
RETURN
/*===== FIND THE FIRST DELIMITED ITEM EXPLICITLY, AS WE SKIPPED THE FIRST ITEM TO AVOID AN "OR" CONDITION IN THE WHERE CLAUSE =====*/
SELECT ItemNumber = CAST(0 AS BIGINT) -- STARTING WITH ZERO FOR SORT AND TO AVOID ROW_NUMBER + 1 LATER
     , ItemValue  = SUBSTRING(@pString, 1,
                    ISNULL(NULLIF(CHARINDEX(@pDelimiter,
                           @pString COLLATE Latin1_General_BIN), 0) - 1, 8000))
 UNION ALL -- CONCATENATE THE OUTPUTS
/*===== FIND THE REST OF THE DELIMITED ITEMS USING THE TALLY TABLE =====*/
SELECT ItemNumber = ROW_NUMBER() OVER (ORDER BY [N])
     , ItemValue  = SUBSTRING(@pString, (N+1),
                    ISNULL(NULLIF(CHARINDEX(@pDelimiter,
                           @pString COLLATE Latin1_General_BIN, (N+1)), 0)
                           - (N+1), 8000))
  FROM [dbo].[Tally] WITH (NOLOCK) -- NOLOCK HINT IS NOT NECESSARY
 WHERE [N] BETWEEN 1 AND LEN(@pString)
   AND SUBSTRING(@pString, [N], 1) = @pDelimiter;
GO
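A minimal smoke test (my sketch, not part of the original post):

SELECT ItemNumber, ItemValue
  FROM dbo.Split8KTally_With_N_PLUS_1('a,b,c,d', ',');
-- Expected: ItemNumber 0 through 3 with ItemValue a, b, c, d.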

  • Usman Butt (5/14/2014)


    BTW, can someone test the physical tally table splitters as well? @paul-2 Can you be so kind to me :unsure:

I'll do it as well; don't hold your breath :Whistling: but soon :cool:

  • mburbea (5/14/2014)


Since the slowdown seems to be the virtual tally table, I'd wonder if a CLR function that just spits out a streaming number table might be able to improve the performance. Unfortunately, I haven't a clue how to write such a thing.

Even though Paul has already addressed the timing discrepancies of the CLR splitter, it still might be interesting to test this theory of using a CLR-based number generator in the T-SQL splitters. I have absolutely no time to test, but I can at least say that such a generator already exists in SQL# (http://www.SQLsharp.com) :-). The function is called Util_GenerateInts and can be used as follows:

    SELECT IntVal FROM SQL#.Util_GenerateInts(1, 8000, 1);

Just be sure to keep in mind the initial load-time issue that Paul mentioned and, if need be, place a dummy call to the function at the top of the script to force the Assembly to be loaded prior to using the function in a real test case. Something like:

    SELECT IntVal FROM SQL#.Util_GenerateInts(1, 1, 1);

    Take care,

    Solomon...

SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
Info sites: Collations • Module Signing • SQLCLR

  • Solomon Rutzky (5/15/2014)


    ...it still might be interesting to test this theory of using a CLR-based number generator in the T-SQL splitters...

I agree, though one has to ask: if you can use SQLCLR for a number generator, why not use it for the split too? :ermm:

As an aside, is the source for Util_GenerateInts available? It seems unlikely to be 'secret sauce'; I'm just curious to see whether you use any interesting techniques or whether it is entirely straightforward.

  • Paul White (5/14/2014)


    Is everyone running Jeff's original/updated test rig from the article to get these results?

    If so, you might not be testing what you think you are. There is a weakness in the test script that can result in the Csv8K table containing duplicates (i.e. every row is the same). This is related to hiding the NEWID call in the view. In (at least) SQL Server 2012 and 2014, this NEWID value may be cached, resulting in every row being the same.

    It is obviously not a very good test to simply split the string once, cache the results in a table spool, and replay that for each duplicate row from the Csv8K table. This particularly (and unfairly) benefits the non-CLR splitters by the way.

    As a quick hack fix, you can use undocumented trace flag 8690 to prevent caching the NEWID value in this part of the script:

[trace flag 8690 script snipped; quoted in full above]

Nevertheless, you need to be careful to verify that the rows in the Csv8K table are actually different, as intended.

    I'd not seen that happen in my testing but know that such a thing could happen. Thanks for the heads up and the patch. When I get my head above water, I'll see if I can incorporate it. Thanks! 🙂

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Jeff Moden (5/15/2014)


    I'd not seen that happen in my testing but know that such a thing could happen. Thanks for the heads up and the patch. When I get my head above water, I'll see if I can incorporate it. Thanks! 🙂

Yes, it was always a calculated risk. It seems the new cardinality estimator in 2014 results in a plan favouring a table spool, which effectively caches a single NEWID call. The semantics of NEWID are a bit of a mess, but that's a whole other discussion.

    Please don't incorporate the hack fix into your fine script - it is undocumented and unsupported after all. No doubt some revision to the script will be necessary, but please try to avoid 8690. Cheers.

  • Paul White (5/15/2014)


    Solomon Rutzky (5/15/2014)


    ...it still might be interesting to test this theory of using a CLR-based number generator in the T-SQL splitters...

I agree, though one has to ask: if you can use SQLCLR for a number generator, why not use it for the split too? :ermm:

As an aside, is the source for Util_GenerateInts available? It seems unlikely to be 'secret sauce'; I'm just curious to see whether you use any interesting techniques or whether it is entirely straightforward.

    From a pragmatic standpoint you are absolutely correct: it makes more sense to just go with the CLR-based splitter. But, this forum--at 63 pages now--is more about testing and learning than anything related to pragmatism ;-). So, it could prove interesting to compare the effect of using a CLR-based number generator in place of the inline tally table. Given that string splitting is only one use of tally tables, if the CLR-based INT generator holds up against the inline tally table, then it would seem to open up more possibilities of where SQLCLR can be of help to people.

    No, nothing special about the code; it is entirely straightforward. But, I will send the relevant portion to you.

    Take care,

    Solomon..

SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
Info sites: Collations • Module Signing • SQLCLR

  • Thank you Paul for figuring out why this benchmark was suddenly flying after upgrading from 2008 R2 to 2014. It's still nice and fast but a lot more reasonable.

Out of humor, I created a SQLCLR function that just spits out a streaming number table. Its C# code is as you'd expect:

using System.Collections;
using System.Data.SqlTypes;
using System.Linq;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    [SqlFunction(
        DataAccess = DataAccessKind.None,
        IsDeterministic = true, IsPrecise = true,
        SystemDataAccess = SystemDataAccessKind.None,
        FillRowMethodName = "FillRow_ReturnValues",
        TableDefinition = "n Int")]
    public static IEnumerable Numbers(SqlInt32 count)
    {
        // SqlInt32 needs an explicit conversion to int for Enumerable.Range
        return Enumerable.Range(1, count.Value);
    }

    private static void FillRow_ReturnValues(object obj, out SqlInt32 n)
    {
        n = (int)obj;
    }
}
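For anyone who wants to wire it up, the T-SQL side would look something like this (a sketch; the assembly name and DLL path are placeholders):

CREATE ASSEMBLY NumbersClr
FROM 'C:\clr\NumbersClr.dll'
WITH PERMISSION_SET = SAFE;
GO
CREATE FUNCTION dbo.Numbers (@count INT)
RETURNS TABLE (n INT)
AS EXTERNAL NAME NumbersClr.UserDefinedFunctions.Numbers;
GO
SELECT n FROM dbo.Numbers(8000); -- streams 1 through 8000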

If anyone would like, I can post the varbinary later. I may not understand all the optimizations that went into the CLR splitter function, so if I did something painfully wrong...

Unfortunately, it seems to actually be even worse than just using a normal cross-join tally table. 2014 assumes 1,000 rows for sTVFs, which is probably where the pain comes in. For the longer strings, I'm guessing the underestimate hurts. With the inline approach the correct number of rows is always present, so the QO can make better guesses.

That said, even when just filling temp tables, it seems to be only slightly faster than the cross-join approach. I'm guessing there might be some clever way to avoid the boxing/unboxing penalty. I could try a sealed class reading a readonly field, but I can't imagine that would be any faster than a box. I also assume that Enumerable.Range is well optimized, and I don't care about handling NULL; that's already being handled as far as I'm concerned.
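For what it's worth, one way to sidestep the per-row box (a sketch of the idea, untested here) is to yield the same object for every row and keep the current value in a field, so the fill-row method reads a field instead of unboxing. SQL Server calls the fill-row method for each yielded object before the enumerator advances, so reusing one instance is safe:

using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

// Hypothetical alternative generator: one allocation for the whole stream.
public class NumberStream : IEnumerable
{
    internal int Current;               // mutated in place for every row
    private readonly int _count;
    public NumberStream(int count) { _count = count; }

    public IEnumerator GetEnumerator()
    {
        for (int i = 1; i <= _count; i++)
        {
            Current = i;
            yield return this;          // same reference every row: no int box
        }
    }
}

public partial class UserDefinedFunctions
{
    [SqlFunction(DataAccess = DataAccessKind.None,
                 SystemDataAccess = SystemDataAccessKind.None,
                 IsDeterministic = true, IsPrecise = true,
                 FillRowMethodName = "FillRow_NumberStream",
                 TableDefinition = "n Int")]
    public static IEnumerable Numbers2(SqlInt32 count)
    {
        return new NumberStream(count.IsNull ? 0 : count.Value);
    }

    private static void FillRow_NumberStream(object obj, out SqlInt32 n)
    {
        n = ((NumberStream)obj).Current;  // field read instead of an unbox
    }
}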

One thing I discovered is that Jeff's test harness is a bit biased, at least when it comes to the CLR functions.

I was trying to write a faster CLR function or a hybrid function, and I came to an astounding and rather sad result: whichever function I placed earlier in the set won the round.

I did this by taking two TVFs pointing to the CLR assembly Paul White wrote and included in the original script, and naming them SplitA and SplitB.

After that, I added the following as warm-up code:

SELECT * INTO #tempA FROM dbo.SplitA('a,b,c,', ',');
SELECT * INTO #tempB FROM dbo.SplitB('a,b,c,', ',');
DROP TABLE #tempA;
DROP TABLE #tempB;

Then I ran the test. Whichever splitter I put first came out ahead by about 5 seconds when using trace flag 8690. I ran this several times to come to this conclusion.

Does anyone have a better harness that isn't as biased toward multiple CLR splitters? From my testing this doesn't seem to be the case for T-SQL functions. Why do the CLR splitters seem to prefer being earlier in the race?

Here is my current version of TestEachFunction, which uses a functions table to make it a bit easier to test:

if object_id('dbo.functions','u') is not null drop table dbo.functions;
CREATE TABLE dbo.functions
(
    name sysname
);
insert into dbo.functions(name)
VALUES ('SplitA'),('SplitB');
GO
if object_id('dbo.TestEachFunction') is not null drop procedure dbo.TestEachFunction;
GO
CREATE PROCEDURE dbo.TestEachFunction
/**********************************************************************************************************************
 Purpose:
 Given the number of rows and elements this testing is for, the stored procedure will test each of the split functions
 for duration and record the results in a table called dbo.TestResults in the current DB (which should be TempDB).

 Revision History:
 Rev 01 - 20 May 2014 - Michael Burbea - Modified to generate the test code dynamically so more functions can be tested easily.
 Rev 00 - 10 Apr 2011 - Jeff Moden - Initial release for testing.
**********************************************************************************************************************/
--===== Declare the I/O parameters
        @pNumberOfRows     INT,
        @pNumberOfElements INT,
        @pMinElementLength INT,
        @pMaxElementLength INT
AS
--=====================================================================================================================
--      Presets
--=====================================================================================================================
--===== Suppress the auto-display of rowcounts for appearance and speed
SET NOCOUNT ON;

--===== Declare some obviously named local variables
DECLARE @StartTime DATETIME,
        @EndTime   DATETIME,
        @Message   SYSNAME,
        @MinLength INT,
        @AvgLength INT,
        @MaxLength INT;

--===== Preset and display the current run message
SELECT @Message = '========== '
       + CAST(@pNumberOfRows AS VARCHAR(10)) + ' Rows, '
       + CAST(@pMinElementLength AS VARCHAR(10)) + ' MinElementSize, '
       + CAST(@pMaxElementLength AS VARCHAR(10)) + ' MaxElementSize, '
       + CAST(@pNumberOfElements AS VARCHAR(10)) + ' Elements '
       + '==========';
RAISERROR(@Message,10,1) WITH NOWAIT;

--===== Calculate some statistics for the condition of the data
SELECT @MinLength = MIN(DATALENGTH(CSV)),
       @AvgLength = AVG(DATALENGTH(CSV)),
       @MaxLength = MAX(DATALENGTH(CSV))
  FROM dbo.Csv8K;

--select CONCAT('declare @pNumberOfRows INT=100,
--               @pNumberOfElements INT=10,
--               @pMinElementLength INT=5,
--               @pMaxElementLength INT=20,@MinLength int=',@MinLength,',@AvgLength int=',@AvgLength,',@MaxLength int=',@MaxLength)

--=====================================================================================================================
--      Run the tests by generating a dynamic script.
--=====================================================================================================================
DECLARE @STR NVARCHAR(MAX) = 'declare @startTime datetime2,@endTime datetime2;'+(select
';RAISERROR(''Testing '+c.name+''',10,1) WITH NOWAIT;
DBCC DROPCLEANBUFFERS;
--===== Start the timer
SELECT @StartTime = sysdatetime();
--===== Run the test
SELECT csv.RowNum, split.ItemNumber, split.Item
INTO #void'+convert(varchar,ROW_NUMBER() over (order by (select NULL)))+'
FROM dbo.CSV8K csv
CROSS APPLY dbo.'+c.name+'(csv.CSV,char(44)) split
--===== Stop the timer and record the test
select @EndTime = sysdatetime();
INSERT INTO dbo.TestResults
       (SplitterName, NumberOfRows, NumberOfElements, MinElementLength, MaxElementLength, Duration, MinLength, AvgLength, MaxLength)
SELECT '''+c.name+''',
       @pNumberOfRows,
       @pNumberOfElements,
       @pMinElementLength,
       @pMaxElementLength,
       DATEDIFF(microsecond,@StartTime,@EndTime)/1e6,
       MinLength = @MinLength,
       AvgLength = @AvgLength,
       MaxLength = @MaxLength
drop table #void'+convert(varchar,ROW_NUMBER() over (order by (select NULL)))+'
;'
from dbo.functions c
for xml path(''));

--===== FOR XML PATH entitizes carriage returns as "&#x0D;"; strip them out.
--      (The original post showed REPLACE(@str,' ',''), where the entity was
--       evidently lost in rendering; removing every space would break the SQL.)
select @STR = REPLACE(@STR,'&#x0D;','');
--select @STR
EXEC sp_executesql @STR,
     N'@pNumberOfRows INT, @pNumberOfElements INT, @pMinElementLength INT, @pMaxElementLength INT,
       @minLength int, @avgLength int, @maxLength int',
     @pNumberOfRows     = @pNumberOfRows,
     @pNumberOfElements = @pNumberOfElements,
     @pMinElementLength = @pMinElementLength,
     @pMaxElementLength = @pMaxElementLength,
     @minLength = @MinLength,
     @avgLength = @AvgLength,
     @maxLength = @MaxLength;
GO
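A sample invocation, assuming dbo.Csv8K has already been built with matching parameters (a sketch):

EXEC dbo.TestEachFunction
     @pNumberOfRows     = 1000,
     @pNumberOfElements = 10,
     @pMinElementLength = 1,
     @pMaxElementLength = 10;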

    Edit:

It actually might just be a symptom of memory pressure. I can't figure out a good way to recycle the CLR app domain on the server (turning CLR enabled off and on doesn't do it), so I restarted the SQL service. The function that was placed first still won the shootout, but the gap was much less severe, now down to hundreds of milliseconds.

I was trying lots of CLR functions, so that could explain why the memory pressure was causing such issues.

  • mburbea (5/20/2014)


Here is my current version of TestEachFunction, which uses a functions table to make it a bit easier to test:

'
--===== Run the test
SELECT csv.RowNum, split.ItemNumber, split.Item
INTO #void'+convert(varchar,ROW_NUMBER() over (order by (select NULL)))+'
FROM dbo.CSV8K csv
CROSS APPLY dbo.'+c.name+'(csv.CSV,char(44)) split
'

    Edit:

...I can't figure out a good way to recycle the CLR app domain on the server (turning CLR enabled off and on doesn't do it), so I restarted the SQL service.

Hey there. For the test proc, it might help to reduce external factors that could influence some of the results, such as the creation and population of the #voidXX temp tables. Since the data in the temp tables is never used, maybe just store the value of each of the three fields in a local variable that gets reassigned for each row? Speaking in terms of the code block I quoted above, it would be replaced with something like:

'
DECLARE @DummyRowNum INT,
        @DummyItemNumber INT,
        @DummyItem VARCHAR(1000);
--===== Run the test
SELECT @DummyRowNum     = csv.RowNum,
       @DummyItemNumber = split.ItemNumber,
       @DummyItem       = split.Item
  FROM dbo.CSV8K csv
 CROSS APPLY dbo.'+c.name+'(csv.CSV,char(44)) split
'

    And then just remove the following line:

'
drop table #void'+convert(varchar,ROW_NUMBER() over (order by (select NULL)))+'
'

    Regarding recycling the App Domain, here is a little known trick that I stumbled upon while testing something a couple of years ago: ALTER an Assembly in that App Domain to change its PERMISSION_SET (but don't forget to ALTER again to put the PERMISSION_SET back to the original value). This will force an unload of the App Domain as long as at least one method in the Assembly has been executed. 😀
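In T-SQL the round trip looks something like this (a sketch; the assembly name is a placeholder, and moving to EXTERNAL_ACCESS requires the database to be TRUSTWORTHY or the assembly to be signed):

-- Flipping PERMISSION_SET forces the App Domain to unload...
ALTER ASSEMBLY [SplitterClr] WITH PERMISSION_SET = EXTERNAL_ACCESS;
-- ...then put it back to its original value.
ALTER ASSEMBLY [SplitterClr] WITH PERMISSION_SET = SAFE;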

    Take care,

    Solomon...

SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
Info sites: Collations • Module Signing • SQLCLR

  • Solomon Rutzky (5/20/2014)


    Regarding recycling the App Domain, here is a little known trick that I stumbled upon while testing something a couple of years ago: ALTER an Assembly in that App Domain to change its PERMISSION_SET (but don't forget to ALTER again to put the PERMISSION_SET back to the original value). This will force an unload of the App Domain as long as at least one method in the Assembly has been executed. 😀

    Take care,

    Solomon...

Thanks, that did the trick.

On the other hand, how can I be sure SQL Server actually does the work and doesn't just cheat, realizing that the first 999 rows don't matter and only processing the last row (since the SQL CLR function is a black box)? I know it's not quite as bad as, say, a C compiler when it comes to removing code, but I know the query optimizer can be quite clever in its evaluation.
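One way to gain some confidence there (a sketch): fold every output row into an aggregate the optimizer cannot skip, then compare the actual row counts in the execution plan:

DECLARE @Check BIGINT;
SELECT @Check = SUM(CAST(split.ItemNumber AS BIGINT) + DATALENGTH(split.Item))
  FROM dbo.CSV8K csv
 CROSS APPLY dbo.SplitA(csv.CSV, CHAR(44)) split;
-- Every row now contributes to @Check, so the splitter has to produce all of them.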
