Simulating memory pressure

  • "Creating a bigger test database will also work, but involves a lot more work, will cause your tests to run longer, and might require you to clean up your collection of holiday pictures."

    Personally, I don't care for any of the possible answers.

    I've never had to cut down a 1TB database to 1GB for testing. Why would that be more work than dropping it to 5GB? I wouldn't recommend keeping so many personal pictures on your company's computer. I also wouldn't recommend doing this kind of performance testing on a laptop or desktop.

    You want your testing environment to match your production environment as closely as possible. Move ALL the data back down to your dev or test server and rip into it. If you don't have room for it, there is a good chance you don't have a disaster recovery plan in place.

    David

  • I think this is more or less the outcome of the discussion so far: a combination of limiting the server memory and running DBCC FREEPROCCACHE before executing the procedure would be the way forward, to cover both the "memory pressure" and the "fair comparison" elements of the question.

    Note that a CHECKPOINT immediately before the performance test is also recommended, so that the dirty buffers are written to disk first (see the sketch below).
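
    For example, a minimal sketch of that sequence - it assumes the memory limit has already been lowered, and dbo.usp_MyProcedure is only a placeholder for the procedure under test:

      CHECKPOINT;              -- write dirty buffers to disk first
      DBCC FREEPROCCACHE;      -- force a fresh compilation of the plan

      DECLARE @start datetime2 = SYSDATETIME();
      EXEC dbo.usp_MyProcedure;    -- placeholder: the procedure being baselined
      SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS elapsed_ms;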

  • Maybe use all of this:

    1) checkpoint

    2) dbcc dropcleanbuffers

    3) dbcc freeproccache

    4) dbcc freesystemcache('all')

    (The last one, step 4, is probably the same as steps 2 + 3 combined.)

    Running the test against the whole TB database may be very time-consuming even when you have the whole DB available; that should be done only after a first pass against the reduced-size DB. (A runnable version of the sequence above is sketched below.)
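
    As a sketch in T-SQL (run it only on a test instance, because it flushes the caches for the whole instance):

      CHECKPOINT;                     -- 1) write dirty pages to disk
      DBCC DROPCLEANBUFFERS;          -- 2) empty the buffer pool of clean pages
      DBCC FREEPROCCACHE;             -- 3) clear the plan cache
      DBCC FREESYSTEMCACHE ('ALL');   -- 4) release unused entries from all caches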

  • About the question: Simulating memory pressure

    You have been asked to optimize a stored procedure that runs against a terabyte-sized database. The stored procedure executes several steps consecutively. The performance problems appear to be mainly I/O related.

    You install a severely trimmed down test version of the database (1 GB in size) on your desktop computer running SQL Server Developer Edition. Before you start optimizing, you want to establish a baseline by timing the stored procedure on your development machine, so that you can later compare performance after adding indexes and tweaking code.

    However, your desktop has 2 GB of memory installed, and you are concerned that the performance test results may be skewed because the test version of the database fits entirely in cache. What is the best way to simulate the production circumstances as closely as possible?

    I don't understand - can someone please explain it to me? Thank you 🙁 That question is very hard for me! 🙁 I'm a novice in SQL Server.

  • lucassouzace,

    You should read the previous posts in this thread. A number of people thought the 'correct' answer was wrong or the question did not have a single answer.

    I am happy to accept that the 'correct' answer will simulate memory pressure, but I am also certain the 'correct' answer is not the right way to troubleshoot a problem with a 1TB table.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • EdVassie (9/1/2008)


    I don't think that the question gives enough detail to say that limiting SQL Server memory to 512 MB is the right answer.

    Pretty sure you're right - and of course a 512 MB limit would be too much.

    Assuming that you have at least 64 GB of RAM for a 1 TB database (usually more like 256 GB if the budget allows), shrinking the DB by 1:1000 should also shrink the memory limit by 1:1000 ==> 64 MB to 256 MB could be a useful limit.

    If you are trying to tune a query against a TB-sized object you must ensure your test environment can recreate the conditions experienced by your full-sized database. Queries against a 1 GB object will behave differently to queries against a 1TB object.

    Usually you get quite different query plans (depending on your query). And of course, if you don't have a single 1 TB data warehouse table, you will have some joins - sorry, but this question and its answers are misleading. (Even DROPCLEANBUFFERS will only simulate hard disk access, BUT you have a slightly different storage subsystem on your desktop than on the server.)

    ... we did a lot of tests with large databases, also for development, shrinking them 10 times and 100 times. Try it out - you'll be surprised by the results. And then, once you're sure what to do, simply move to another machine (from 4 cores to 8 cores, from 8 cores to 16 cores, from 8 GB RAM to 12 GB to 24 to 32 GB) and you'll quite likely be surprised again. And do not forget: switch from an Intel to an AMD platform and you'll see yet another set of results.

    So, if you simply dive into a 2GB desktop machine and hope to tune problems found in a TB-sized object, your solutions are at best suspect and at worst counterproductive.

    I highly agree with this last sentence. (But as written before, the question was about simulating memory pressure, so I suggest 512 MB is too much to achieve this - see the sketch below. We cannot tune a database here in the forum :cool:)
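
    For reference, a sketch of setting such a scaled-down limit (the 256 MB value is only an illustration - pick whatever matches your shrink ratio; max server memory is a dynamic setting, so no restart is needed, but note that newer SQL Server versions will not accept values below 128 MB):

      EXEC sp_configure 'show advanced options', 1;
      RECONFIGURE;
      EXEC sp_configure 'max server memory (MB)', 256;   -- scaled-down limit for the test box
      RECONFIGURE;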

  • I don't like any of the answers offered. The one I think comes nearest to being sensible is to increase the test DB size (or actually not to make such a stupidly small test DB in the first place), but as has already been said, a much better solution is to use a full-size DB with a test server capable of handling it. Anyway, a stored proc that is perhaps handling a significant proportion of a terabyte of data with a shortage of memory is going to be doing a good deal more than 1 GB of I/O, and there's probably no imaginable way that the same SP handling a similar proportion of 1 GB with 0.5 GB available is going to be doing anything like the same amount of I/O. So reducing the memory available to SQL Server on the test machine to 0.5 GB is not going to give you a clue about the performance of the production system on the full-size DB, which is why I don't at all like the "correct" answer.

    Tom

  • Tom, thanks for your comment. I mostly agree with what you say.

    This was one of my first questions here, and not one I am still very proud of. The idea was to educate people about a possible use of the max server memory setting that I had come across while preparing a talk about query optimization, where I wanted to demonstrate cached vs non-cached IO without having to create an immense test database (because I was running out of free hard disk space).

    But rereading the question and answer options now, I must admit that I somewhat disagree with myself now. 😀


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

