Huge number of tables

  • Hi all,

    I've inherited a situation where we have a business-critical system with a gigantic number of tables: 240,000 - yes, nearly a quarter of a million.

    I'll investigate further and update this post, but my guess for now is that either the vendor or other parties have been periodically dumping data via SELECT INTO and should then have deleted/archived the elderly tables, but didn't.

    I would appreciate any/all input.

  • This might come in handy:


    SELECT [TableName] = OBJECT_NAME(object_id),
           last_user_update, last_user_seek, last_user_scan, last_user_lookup
    FROM sys.dm_db_index_usage_stats
    WHERE database_id = DB_ID('MyDB');

    See: https://sqlandme.com/2014/03/11/sql-server-how-to-get-last-accessupdate-time-for-a-table/

  • JackCarrington - Wednesday, August 8, 2018 7:39 AM

    Run away as fast as you can!
    You could actually try to identify which tables are not being used and start cleaning as fast as possible. Hopefully, there's at least a decent naming system (clear names instead of random or semi-random fixed length strings)

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Luis Cazares - Wednesday, August 8, 2018 9:11 AM

    Run away as fast as you can!
    You could actually try to identify which tables are not being used and start cleaning as fast as possible. Hopefully, there's at least a decent naming system (clear names instead of random or semi-random fixed length strings)

    The table names themselves provide a clue....
    There are a few with fairly banal names...
    EVENTPRICEAUDIT
    EVENTPRICEBENEFIT
    EVENTPRICEBENEFITAUDIT etc...  BUT

    You have literally hundreds (if not thousands) like

    MKTSEGMENTATIONEXPORT_0000dd56_af21_4b45_90df_a9300397d9af
    MKTSEGMENTATIONEXPORT_00018d8e_35cc_499c_b116_b4b25c24b7b9
    MKTSEGMENTATIONEXPORT_0001e727_8e3d_418b_9cc7_be010eeecc19
    MKTSEGMENTATIONEXPORT_0003d4d5_c3e8_435e_955f_041be5571f10

    Or

    EXPORT_96d02641_4cd0_4523_8581_0e934a18d990
    EXPORT_96d3554a_6401_4eb7_80ea_c63ef3e43241
    EXPORT_96d56ad3_e115_4cc2_9a1e_e358ef74f2fc
    EXPORT_96d5a9a9_5c5b_4f11_9b2a_b4e6df937f7c
    EXPORT_96d613cd_bf96_41fe_84e8_8b807061b5f0

    You'll note each table has a meaningful name followed by a string of what appear to be system-generated characters, containing 5 underscores in every case.

    What I need is a script that finds all such tables.  Here's what I've used so far...

    DBCC FREEPROCCACHE;
    DBCC DROPCLEANBUFFERS;

    SELECT [TableName] = OBJECT_NAME(object_id) --, COUNT(*) AS 'QTY'
    FROM sys.tables
    GROUP BY object_id
    --HAVING 'QTY' > 1
    ORDER BY OBJECT_NAME(object_id);
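
    Something along these lines might be closer to what I'm actually after - a rough, untested sketch that assumes the suffix is always a GUID (8-4-4-4-12 hex digits) with the dashes swapped for underscores:

    -- List tables whose names end in a GUID-style suffix, newest first
    SELECT s.name AS SchemaName,
           t.name AS TableName,
           t.create_date
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE t.name LIKE '%[_]' + REPLICATE('[0-9a-fA-F]', 8) + '[_]'
                             + REPLICATE('[0-9a-fA-F]', 4) + '[_]'
                             + REPLICATE('[0-9a-fA-F]', 4) + '[_]'
                             + REPLICATE('[0-9a-fA-F]', 4) + '[_]'
                             + REPLICATE('[0-9a-fA-F]', 12)
    ORDER BY t.create_date DESC;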

  • JackCarrington - Friday, August 10, 2018 9:06 AM

    That system generated string looks like GUIDs with the dashes changed to underscores.
    Looks like quite a few of those can be dropped.

  • Lynn Pettis - Friday, August 10, 2018 9:13 AM

    That system generated string looks like GUIDs with the dashes changed to underscores.
    Looks like quite a few of those can be dropped.

    Sounds like they have some process that is generating staging tables or something and then not cleaning them up. Unfortunately, since they used GUIDs instead of date stamps or something useful, it might be a pain to figure out what to actually remove, and if that process is being run regularly it'll just keep creating tables.

    Maybe try running a trace or something and see what's creating those tables?
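
    If the default trace is still enabled it usually captures Object:Created events, so something along these lines might show who or what is creating them - rough sketch only, and the default trace rolls over, so it will only cover recent history:

    -- Read Object:Created events for user tables from the default trace
    DECLARE @path nvarchar(260);
    SELECT @path = path FROM sys.traces WHERE is_default = 1;

    SELECT t.StartTime, t.DatabaseName, t.ObjectName,
           t.LoginName, t.ApplicationName, t.HostName
    FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
    WHERE t.EventClass = 46      -- Object:Created
      AND t.ObjectType = 8277    -- user table
    ORDER BY t.StartTime DESC;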

  • ZZartin - Friday, August 10, 2018 11:22 AM

    Sounds like they have some process that is generating staging tables or something and then not cleaning them up. Unfortunately, since they used GUIDs instead of date stamps or something useful, it might be a pain to figure out what to actually remove, and if that process is being run regularly it'll just keep creating tables.

    Maybe try running a trace or something and see what's creating those tables?

    Use the created date in the sys.tables system view.
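
    For example (quick sketch, using the prefixes from the examples above):

    SELECT name, create_date, modify_date
    FROM sys.tables
    WHERE name LIKE 'EXPORT[_]%'
       OR name LIKE 'MKTSEGMENTATIONEXPORT[_]%'
    ORDER BY create_date;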

  • Lynn Pettis - Friday, August 10, 2018 11:24 AM

    Use the created date in the sys.tables system view.

    Along with the last_user_seek, last_user_scan, last_user_lookup & last_user_update columns from sys.dm_db_index_usage_stats. Note that these only reflect activity since the stats were last reset (e.g. an instance restart), so a NULL doesn't necessarily mean the table has never been used.
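
    Something like this might combine the two - a rough sketch, so verify it before acting on anything it reports:

    -- Creation date plus the most recent recorded access per table
    -- (NULL = nothing recorded since the usage stats were last reset)
    SELECT t.name,
           t.create_date,
           LastAccess = (SELECT MAX(d)
                         FROM (VALUES (u.last_user_seek), (u.last_user_scan),
                                      (u.last_user_lookup), (u.last_user_update)) AS v(d))
    FROM sys.tables AS t
    LEFT JOIN (SELECT object_id,
                      MAX(last_user_seek)   AS last_user_seek,
                      MAX(last_user_scan)   AS last_user_scan,
                      MAX(last_user_lookup) AS last_user_lookup,
                      MAX(last_user_update) AS last_user_update
               FROM sys.dm_db_index_usage_stats
               WHERE database_id = DB_ID()
               GROUP BY object_id) AS u ON u.object_id = t.object_id
    ORDER BY LastAccess, t.create_date;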

  • JackCarrington - Friday, August 10, 2018 9:06 AM

    I've seen this many times: reports/extracts/queries are persisted and never cleaned up. Time for a spring cleaning!
    😎
    My advice: take a full backup, drop anything that looks like a persisted query, and don't answer the phone for a few days 😀
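
    For the backup part, something like this (just a sketch - the database name and path are placeholders):

    BACKUP DATABASE MyDB
        TO DISK = N'X:\Backups\MyDB_PreCleanup.bak'
        WITH COPY_ONLY, COMPRESSION, CHECKSUM, STATS = 10;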

    Rather than dropping tables that you may have to restore when the phone does ring, rename them with a prefix of "ToBeDeleted_yyyymmdd_", where "yyyymmdd" is a date that's 5 weeks out from the current date.  Build a "sweeper" that scans for anything with a date in the table name that's less than the current date and drops only those.

    If the phone does ring, it's a whole lot easier to simply remove the prefix from the table name than it is to do a restore to recover a single table or even a couple of tables.
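
    A rough sketch of that idea - the EXPORT filter is just an example, and both queries only generate statements that should be reviewed before anything is executed:

    -- Generate renames that mark candidates for deletion five weeks from now
    DECLARE @tag varchar(30) = 'ToBeDeleted_' + CONVERT(char(8), DATEADD(WEEK, 5, GETDATE()), 112) + '_';

    SELECT 'EXEC sp_rename ''' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ''', ''' + @tag + t.name + ''';'
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE t.name LIKE 'EXPORT[_]%'              -- adjust to the tables you've identified
      AND t.name NOT LIKE 'ToBeDeleted[_]%';

    -- The "sweeper" then generates drops for anything whose embedded date has passed
    SELECT 'DROP TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';'
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE t.name LIKE 'ToBeDeleted[_][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][_]%'
      AND SUBSTRING(t.name, 13, 8) < CONVERT(char(8), GETDATE(), 112);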

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Jeff Moden - Saturday, August 11, 2018 10:52 AM

    Good point on the renaming, Jeff; you are much kinder than I!
    😎

    Having gone through this kind of thing a few times, my question would always be "so what was the schema and table name again?" When work areas spill into production, sometimes one just has to do a proper spring/summer/autumn/winter cleaning.

  • Luis Cazares - Wednesday, August 8, 2018 9:11 AM

    Run away as fast as you can!
    You could actually try to identify which tables are not being used and start cleaning as fast as possible. Hopefully, there's at least a decent naming system (clear names instead of random or semi-random fixed length strings)

    +100

    ...

  • Eirikur Eiriksson - Saturday, August 11, 2018 11:04 AM

    Good point on the renaming, Jeff; you are much kinder than I!
    😎

    Having gone through this kind of thing a few times, my question would always be "so what was the schema and table name again?" When work areas spill into production, sometimes one just has to do a proper spring/summer/autumn/winter cleaning.

    Heh.... you are seriously mistaken.  I don't do it to be kind.... I do it so that I'll have less work to do when the phone rings. 😀


  • Luis Cazares - Wednesday, August 8, 2018 9:11 AM

    Run away as fast as you can!
    You could actually try to identify which tables are not being used and start cleaning as fast as possible. Hopefully, there's at least a decent naming system (clear names instead of random or semi-random fixed length strings)

    BWAAA-HAAAA!!!!   That would be my first reaction as well, BUT... people need to get over that, because it's a really easy place to shine and prove your worth early on in the game, especially if you use the "rename it to mark it for deletion" method.


  • Jeff Moden - Saturday, August 11, 2018 1:13 PM

    BWAAA-HAAAA!!!!   That would be my first reaction as well, BUT... people need to get over that, because it's a really easy place to shine and prove your worth early on in the game, especially if you use the "rename it to mark it for deletion" method.

    Indeed it is, and that's one reason I'm doing this (the other being that I frickken LOVE playing detective!!).

    Jeff, that was a sterling suggestion, sir. Thank you!!
