How to export dataset into Excel when row limit exceeds 65535

  • Hi Guys,

    I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls formatted files because they are behind on upgrading to 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break the large dataset up into chunks and automatically place it into multiple tabs on the spreadsheet?

    Thanks,

    Code

  • One way to do this is dump the data into a staging table, including a SheetNumber column, break the data up that way, select the distinct SheetNumber values, and loop through them, creating the "tables" (worksheets) in Excel at the beginning of each loop, and then exporting just the records for that sheet.
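    The staging-table approach can be sketched in T-SQL. This is only a sketch: StagingExport, ExportSource and SortKey are illustrative names, and each sheet is capped at 65,535 data rows to leave room for a header row in .xls:

    ```sql
    -- Stage the export and assign each row a SheetNumber,
    -- with at most 65,535 data rows per sheet.
    SELECT  (ROW_NUMBER() OVER (ORDER BY SortKey) - 1) / 65535 + 1 AS SheetNumber,
            src.*
    INTO    dbo.StagingExport
    FROM    dbo.ExportSource AS src;

    -- The Foreach loop then iterates over the distinct sheet numbers...
    SELECT DISTINCT SheetNumber
    FROM   dbo.StagingExport
    ORDER  BY SheetNumber;

    -- ...creating a worksheet ("table") per iteration and exporting
    -- just that sheet's rows:
    -- SELECT <columns> FROM dbo.StagingExport WHERE SheetNumber = ?
    ```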

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Code-1029433 (3/9/2011)


    Hi Guys,

    I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls formatted files because they are behind on upgrading to 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break the large dataset up into chunks and automatically place it into multiple tabs on the spreadsheet?

    Thanks,

    Code

    You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Thanks for the suggestions, guys! I'll have our sales team work with the clients to see if they will accept .csv; if so, we'll go that route, otherwise we'll look into using the first suggestion.

  • Phil Parkin (3/9/2011)


    Code-1029433 (3/9/2011)


    Hi Guys,

    I've been searching the forums trying to find the answer to my question, but I'm not having much luck. The majority of our clients require that we deliver data to them in .xls formatted files because they are behind on upgrading to 2007/2010. We have some data sets returning 200,000+ rows. Is it possible to have SSIS break the large dataset up into chunks and automatically place it into multiple tabs on the spreadsheet?

    Thanks,

    Code

    You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.

    CSV files will also be smaller (though still huge with that many rows).

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Phil Parkin (3/9/2011)


    You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.

    Second that. Excel is a spreadsheet application, not a data transfer format.

    When will the world know?!? *cries in despair*

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Phil Parkin (3/9/2011)


    You could always give them CSV files instead. Excel is the wrong tool for handling so many lines of data. If they are processing this data after you provide it to them, the fact that it's split over multiple tabs is surely just going to slow them down.

    Unless they're trying to use excel to manipulate the data after it's been exported to them, sigh.....

    You could try using a conditional split in the dataflow task to redirect some of the rows to other tabs. You'd need to have some way to specify groups of rows to split out of the data set.
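    One way to drive such a conditional split is to compute the group in the data flow's source query and route on it. A sketch with illustrative table and column names:

    ```sql
    -- Source query for the SSIS data flow: number the rows and derive a
    -- SheetNumber column the Conditional Split can route on.
    SELECT  (ROW_NUMBER() OVER (ORDER BY OrderID) - 1) / 65535 + 1 AS SheetNumber,
            OrderID, CustomerID, OrderDate
    FROM    dbo.Orders;

    -- The Conditional Split conditions would then be SheetNumber == 1,
    -- SheetNumber == 2, ..., each output wired to its own Excel Destination
    -- (i.e. its own worksheet in the workbook).
    ```

    The drawback is that the number of outputs is fixed at design time, so this suits exports whose row counts are reasonably predictable.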

  • Here is an article, complete with sample code that may provide the T-SQL code you are searching for:

    http://www.mssqltips.com/tip.asp?tip=1202

    Additional data replete with many, many code samples

    http://www.simple-talk.com/sql/t-sql-programming/sql-server-excel-workbench/

    Another possible method, have Excel pull the data from the DB.

    http://qa.sqlservercentral.com/articles/Stored+Procedures/63537/

    If everything seems to be going well, you have obviously overlooked something.

    Ron

    Please help us, help you - before posting a question please read
    Before posting a performance problem please read

  • Unless they're trying to use excel to manipulate the data after it's been exported to them, sigh.....

    Brings to mind a whole batch of accountants doing multi-tab analyses very very slowly and knowing that there must be a better way...

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Phil Parkin (3/10/2011)


    Brings to mind a whole batch of accountants doing multi-tab analyses very very slowly and knowing that there must be a better way...

    Oh but they do (or at least they ought to)... there have been so many instances where they are told not to do VLOOKUP on that many rows, do it in a database, or themes along those lines...

    Sadly, many accountants are totally wedded to Excel. And then MS, in their infinite wisdom, raised the 65,536-row worksheet limit past the million mark (1,048,576 rows) in Office 2007. Trust me, some accountant somewhere will be religiously hitting that limit too, in their download of GL data from <insert package of choice here>, where they then sort the result, match and reconcile. :-D
