SQL SERVER STALLED

  • We have a project built in Visual Basic 6.0 and SQL Server 2000, and we have one big problem. We have a process called Reconciliation, where we compare the fields of two tables based on certain settings and then do some arithmetic calculations before it is done. All of these comparisons and calculations are done using queries, with hardly any loops running in VB. The problem is that while this runs, SQL SERVER becomes 100% busy and the rest of the users cannot work until the reconciliation process is complete, which can take anywhere from half an hour to an hour.

    Please suggest what can be done.

  • Are you sure the Server is 'busy', or do you have a number of locks on the data?

    In the first case, I'm afraid there is not much you can do besides optimising your queries.

    In the latter case, you should check the locking mechanism that is used in VB for the connection.

  • My CPU usage at the server reads 100% when the reconciliation happens, and VB doesn't support multithreading. Is there any way I can multithread in SQL Server?

  • I would look at the query execution plan to see whether the proper indexes are being used, and at the number of physical IOs that occur during the reconciliation.
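
    If it helps, you can check both from Query Analyzer. A minimal sketch, assuming placeholder table and column names (ReconTableA, ReconTableB, AccountNo, Amount are just examples, not your actual schema):

    EXEC sp_helpindex 'ReconTableA'   -- list the indexes that exist on each table
    EXEC sp_helpindex 'ReconTableB'
    GO

    SET SHOWPLAN_TEXT ON              -- show the estimated plan instead of running the query
    GO
    SELECT a.AccountNo, a.Amount, b.Amount
    FROM ReconTableA a
    JOIN ReconTableB b ON b.AccountNo = a.AccountNo
    WHERE a.Amount <> b.Amount        -- stand-in for one of the reconciliation comparisons
    GO
    SET SHOWPLAN_TEXT OFF
    GO

    Table or clustered index scans in the plan output usually mean a supporting index is missing.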

  • I would consider converting the logic in the VB app to a stored procedure so that it runs inside SQL Server and can be scheduled as a job. If you need help, post the code here.

  • Changing it into a stored procedure is not a problem, but it is the same query that is going to run on the server. Is there any way we can have simultaneous queries running (multithreading)? I mean, when I run this reconciliation, if a person tries to log in to the application, it is not allowed in and has to wait until that query is over.

  • As NPeeters mentioned, look at the default locking done by the VB logic. We had a similar situation with a sproc doing large calculations on a table used for simultaneous reporting. Turned out that the calculation piece was doing a delete on the table with criteria. This placed an exclusive lock on the entire table, thus blocking out everyone else, including select statements.

    While the calculations are happening, try going into Enterprise Manager and viewing management --> current activity --> locks by object. This will show you the current locks on the table in question as well as the types of locks.
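
    If you would rather do it from Query Analyzer, the system procedures show the same information (the SPID below is only an example):

    EXEC sp_who2      -- the BlkBy column shows which SPID is blocking the others
    EXEC sp_lock      -- Mode X on Type TAB means an exclusive lock on a whole table
    EXEC sp_lock 57   -- locks held by a single SPID, e.g. the reconciliation connection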

    Also, incorporating this logic into a sproc will greatly improve the performance.

    Look at the volume of data affected by this process, too. Perhaps your server is running low on resources (memory, CPU, etc)?

  • Here are some tips:

    -Don't use sub-queries. Most of the time you can accomplish the same with joins

    -Make sure that you index the columns you use in the WHERE clause

    -Place the query into a stored procedure. This saves SQL Server the overhead of optimizing the query every time it runs. Keep in mind, however, that a poorly constructed query will not improve much even in a stored procedure.

    -When using a stored procedure you can take advantage of temporary tables. Create a temp table that reflects the desired result columns (the columns in the SELECT clause), populate the table with data and update it incrementally, and then return the results of the temp table. When working this way, make sure to turn counting off (Set NoCount On) at the beginning of the stored procedure and turn it back on (Set NoCount Off) just before the final SELECT statement.
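
    A minimal sketch of that pattern, with placeholder table and column names (ReconTableA, ReconTableB, AccountNo, Amount are assumptions, not your real schema):

    CREATE PROCEDURE usp_Reconcile
    AS
    SET NOCOUNT ON                          -- turn counting off at the start

    CREATE TABLE #Recon (                   -- temp table shaped like the desired result
        AccountNo  INT    NOT NULL,
        AmountA    MONEY  NULL,
        AmountB    MONEY  NULL,
        Difference MONEY  NULL
    )

    INSERT INTO #Recon (AccountNo, AmountA, AmountB)
    SELECT a.AccountNo, a.Amount, b.Amount  -- join rather than a sub-query
    FROM ReconTableA a
    JOIN ReconTableB b ON b.AccountNo = a.AccountNo

    UPDATE #Recon                           -- do the arithmetic incrementally
    SET Difference = AmountA - AmountB

    SET NOCOUNT OFF                         -- turn counting back on before the final SELECT
    SELECT AccountNo, AmountA, AmountB, Difference
    FROM #Recon
    WHERE Difference <> 0
    GO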

  • There's probably one other thing that you could do.

    If you need concurrent access to the same tables that are being processed, so other users can still query them, the queries that are not doing the reconciliation should be specified with NOLOCK, i.e.

    select * from table1 with (nolock)

    hth

    As Allen stated earlier, look at the query plan and the general well-being of the indexes on the tables. Correct indexes? Need to be defragged/rebuilt? Update statistics?
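
    On SQL Server 2000, checking and fixing those looks roughly like this (the table name is a placeholder):

    DBCC SHOWCONTIG ('ReconTableA')   -- scan density well below 100% means heavy fragmentation
    DBCC DBREINDEX ('ReconTableA')    -- rebuild all indexes on the table
    UPDATE STATISTICS ReconTableA     -- refresh the statistics the optimizer relies on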

    We had a circumstance in which an index rebuilding job was disabled, and over time the processor usage increased to the point that it pegged out at 100%. I guess all of the page traversing was consuming an excessive amount of CPU.

    I don't think locking is the problem. Intuitively, locking would probably reduce the CPU usage because processes sit idle waiting for locks to be released - unless, of course, you have a ROWLOCK hint on a table that is locking an excessive number of rows. In that case, the lock processing may be consuming more resources than your commands.

    Multi-threading on the VB side won't help you - by definition, it would actually cause additional processing on the server side. And besides, multi-threading in VB6 is a real kludge.

    The best way to fix complex and mysterious problems is to isolate the true cause. You might want to try running the Profiler and look at the commands that are consuming the most resources. Also, turn on statistics IO and statistics time in your Query Analyzer options and execute the commands. Look at the areas that consume the most resources.
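
    In Query Analyzer that looks roughly like this (the comment stands in for the actual reconciliation statements):

    SET STATISTICS IO ON      -- report logical and physical reads per table
    SET STATISTICS TIME ON    -- report parse/compile and execution CPU time
    GO
    -- run the reconciliation query here, then read the Messages tab
    GO
    SET STATISTICS IO OFF
    SET STATISTICS TIME OFF
    GO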

    Hope this helps.

    Bill Bertovich

    bbertovich@eliberation.com


