my tables are too big to query!

  • i'm maintaining a sql database that i designed and built, with an Access front-end interface which i also built.  i've been doing this stuff for years, and personally, i've long thought that Access is not the ideal tool for the job.  unless i'm missing something pretty significant, it appears that Access's performance as a front end for a large SQL database diminishes hugely as the size of the database increases.

    anyway - there is this one table of approximately 3,000,000 records which is used in nearly every query, as it contains crucial demographic data that is almost always relevant to whatever data-pooling i or anybody else is doing.  this table, and all of the others where appropriate, are indexed; yet i pretty much cannot query this database anymore if the huge table in question is involved.  in SQL Query Analyzer, it takes 5 minutes to return the most basic set of data you can imagine, 15 minutes or so for a query joining two tables (where one of them is this big one) in a one-to-one relationship, half an hour for a query that does any calculating at all, and pretty much an hour for 95% of the queries i run.  when this huge table is not involved, queries run in no time.

    as you'd expect, the problem is grotesquely magnified when i'm running Access queries on these tables from the front end.  the problem is, that's the program the users know, so that's where the queries are being written and used.  i don't have the time or resources to train people to write SQL code, or to convert their Access queries to SQL and run them for them; as a result, nearly all of the queries that people write (even the ones i proofread for efficiency) hang the database and never complete, causing users to End Task.  this doesn't make me look so good as a DBA, either.  my users feel like they cannot access the data they need, and it's causing me grief at work - not to mention a sense of personal dissatisfaction.

    i've tried all kinds of ways to trim down the size of this huge table, such as creating a View which filters out records that aren't likely to be needed - but this is an inexact science at best, and sometimes produces inconvenient results.  for example, a given record can be unavailable because it was filtered out by my close-but-not-perfect filtering, causing the user, who knows the record should be there, to think my database is totally haphazard.

    i've also tried splitting the table into smaller tables and having the query run comparisons to figure out which sub-table to use as its source; but that causes problems when one wants to pull full summaries across all of the data in the original table.

    blah blah blah.  there's gotta be some way to query a table of 3 million records efficiently.  i mean, there are companies with way more than 3 million records who need efficient querying - they can't all have this problem.

    anyone?

    thanks

     

  • hey deek

    it sounds like you have tried some different workarounds, but i built precisely the same type of query tool for users and it worked ok - not lightning, but pretty fast.

    it was Access 97, then Access 2000, against SQL Server 7.

    a few questions:

    you say that the query analyzer runs slowly too - where are the access queries hanging?

    does the data that the users are querying have to be intra-day up to date?

    i found that for a lot of the queries the users were running, they knew the data wouldn't have changed that day, so they could use indexed summary tables (collections of the occurrences of index values from the main table, regenerated nightly) which are linked to the main table via the index fields, with the criteria set against the summary table.  if they want bang up to date data they can query the main table directly, with the known time caveats.
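    a minimal sketch of that nightly summary-table rebuild, assuming a hypothetical main table dbo.Demographics with Region and AgeBand columns (swap in whatever fields your users actually set criteria on):

```sql
-- hypothetical names: dbo.Demographics is the big main table,
-- dbo.DemoSummary is the summary rebuilt nightly by a scheduled job
IF OBJECT_ID('dbo.DemoSummary') IS NOT NULL
    DROP TABLE dbo.DemoSummary

-- one row per combination of the indexed fields, with an occurrence count
SELECT Region, AgeBand, COUNT(*) AS RecordCount
INTO dbo.DemoSummary
FROM dbo.Demographics
GROUP BY Region, AgeBand

-- index the fields the criteria will be set against
CREATE CLUSTERED INDEX IX_DemoSummary ON dbo.DemoSummary (Region, AgeBand)
```

    users then set their criteria against dbo.DemoSummary and join back to the main table on the same fields only when they need row-level detail.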

    access against a large sql server is really pretty good for what you get, but it isn't as efficient and makes big demands on:

    a) the user pcs - more memory, a bigger paging file, faster network cards;

    b) the network topography - we used a fast switch for the SQL server;

    c) the SQL Server itself.  since your query analyzer queries are also really slow, is the current hardware up to the job?  does the server box have a fast network card?

    it sounds like they are mostly querying the same table - if they have similar queries, or are querying similar resultsets, you might want to create and distribute access pass-through queries.  these work pretty quickly (access takes a hands-off approach), and they can be based on stored procedures.  the only problem i had was where the pass-through needed its criteria changing.  you can do this dynamically through the querydef object definition, but it seemed a little buggy at the time - it may have improved subsequently.  either way, you end up with effectively a true client/server query and an access resultset.
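    for example, a pass-through could call a stored procedure like this one (the procedure, table, and column names here are made up for illustration):

```sql
-- hypothetical stored procedure for a pass-through query to call;
-- the filtering runs entirely on the server
CREATE PROCEDURE dbo.usp_DemographicsByRegion
    @Region varchar(50)
AS
SELECT *
FROM dbo.Demographics
WHERE Region = @Region
```

    the pass-through query's SQL in Access is then just EXEC dbo.usp_DemographicsByRegion 'North', so changing the criterion means rewriting only that one literal.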

     

    hope this helps a bit.

    it is possible !!!

     

    If you build the queries from an Access MDB with tables linked to SQL Server, there is one important thing to know: some complex queries may get executed on the client, not on the server (while the majority of simple queries are simply passed on to SQL Server to execute).  This happens because Access cannot translate the query from the Access SQL dialect into the SQL Server SQL dialect.

    Here are some cases when this may happen (from Access Help):

    http://office.microsoft.com/assistance/preview.aspx?AssetID=HP051874531033&CTT=4&Origin=CH062526191033

    (under the "Improve performance of tables linked to tables in a SQL Server database" topic).  The most important cases are:

    - many left joins

    - joins based on a group by

    You can diagnose if this happens in one of two ways:

    a) use SQL Server Profiler to see if there are multiple queries executed for only one Access query.

    b) if the SQL Server is on another machine, look for intense network activity in the "send" direction (not "receive").

    The solution would be to store the queries on the server (as views) or to use pass-through queries.
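    For example, a join that Access might otherwise evaluate on the client can be stored on the server as a view (table and column names here are only placeholders):

```sql
-- the LEFT JOIN now runs on the server; Access just links to the view
CREATE VIEW dbo.vw_DemographicsWithOrders
AS
SELECT d.PersonID, d.Region, o.OrderDate, o.Amount
FROM dbo.Demographics d
LEFT JOIN dbo.Orders o ON o.PersonID = d.PersonID
```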

    Razvan

    Please ignore my last post. I just saw that you also have problems when you run the queries from SQL Query Analyzer, so it is not an MS Access issue.

    If the big table is usually involved in a GROUP BY with some COUNTs or SUMs, I suggest that you store the grouped data in another table (or tables), maintained using triggers.
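    A sketch of that approach, assuming a hypothetical dbo.Demographics table summarized into a dbo.DemoCounts table by Region (for brevity, this only adjusts groups that already exist in the summary; handling brand-new Region values is omitted):

```sql
-- keep dbo.DemoCounts (Region, RecordCount) in sync with the big table
CREATE TRIGGER trg_Demographics_Counts
ON dbo.Demographics
FOR INSERT, DELETE
AS
-- add the counts contributed by newly inserted rows
UPDATE s
SET RecordCount = s.RecordCount + i.cnt
FROM dbo.DemoCounts s
JOIN (SELECT Region, COUNT(*) AS cnt FROM inserted GROUP BY Region) i
  ON i.Region = s.Region

-- subtract the counts of deleted rows
UPDATE s
SET RecordCount = s.RecordCount - d.cnt
FROM dbo.DemoCounts s
JOIN (SELECT Region, COUNT(*) AS cnt FROM deleted GROUP BY Region) d
  ON d.Region = s.Region
```

    Queries that used to GROUP BY the 3-million-row table can then read dbo.DemoCounts directly.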

    Razvan
