Bug in exporting large chunks of data to flat files

  • Hi,

I need to export data from my SQL Server instance in JSON format (so that it can be imported into a graph database).

When there is little data this is no problem; I can even run the query and copy-paste the result. But that only works while the output file is roughly under 4 MB. For large dumps of data - in some instances up to 100 MB - I use SQL Server's export technique.

    You can find the documentation in the answer here:

    https://stackoverflow.com/questions/952247/sql-server-truncation-and-8192-limitation

For the case of JSON you just have to tick the Unicode box.

[screenshot: jsonbug1]

    Then you add your query:

[screenshot: jsonbug2]

And finally you choose your delimiters:

[screenshot: jsonbug3]

The problem is that when the dump is too large, it gets corrupted. You get odd empty lines, etc. See:

[screenshot: jsonbug4]

At times, in very large files (over 30 MB), splits/new lines appear after the opening quotes of properties/attributes, etc.
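Until the root cause is fixed, one way to spot the damage is to post-process the export: drop the stray blank lines and flag any remaining line that no longer parses as JSON (those are usually the records that were split mid-value). A minimal sketch, assuming one JSON object per line - the sample data below is hypothetical, not from my actual dump:

```python
import json

def check_json_lines(lines):
    """Drop blank lines and report which remaining lines fail to parse as JSON.

    Returns (clean_lines, bad_line_numbers). Lines that fail to parse are
    typically the two halves of a record the exporter split.
    """
    clean, bad = [], []
    for number, line in enumerate(lines, start=1):
        if not line.strip():          # the stray empty lines in the dump
            continue
        try:
            json.loads(line)
            clean.append(line.rstrip("\n"))
        except json.JSONDecodeError:
            bad.append(number)
    return clean, bad

# Hypothetical dump fragment: a blank line plus one record split in two
sample = ['{"id": 1, "name": "a"}\n', '\n', '{"id": 2, "na\n', 'me": "b"}\n']
clean, bad = check_json_lines(sample)
print(clean)  # the intact records
print(bad)    # line numbers of the split halves
```

This only detects the breaks; stitching the halves back together safely would need knowledge of the record delimiter you chose in the wizard.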

Has anyone here come across this issue?

How can we inform Microsoft about it so that they provide a quick resolution? I mean, will writing to forums or opening a case solve the issue in a reasonable amount of time?

Is there any other approach to exporting JSON data?
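One alternative I am considering is skipping the flat-file wizard entirely and writing the JSON from a small client script, serializing one row per line. A minimal sketch - in practice the rows would come from a database cursor (e.g. pyodbc); here they are hypothetical sample data:

```python
import io
import json

def rows_to_json_lines(columns, rows, out):
    """Write each row as one JSON object per line (newline-delimited JSON)."""
    for row in rows:
        out.write(json.dumps(dict(zip(columns, row)), ensure_ascii=False))
        out.write("\n")

# Hypothetical sample in place of a real cursor fetch:
buf = io.StringIO()
rows_to_json_lines(["id", "name"], [(1, "alpha"), (2, "beta")], buf)
print(buf.getvalue())
```

Since each record is serialized by the client, nothing passes through the wizard's column-width handling at all.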

    Thank you!


  • Hello dimitrisv

The reference quoted above states that SSIS was not an option. Is that so in your case? SSIS is something to try. In the Import/Export Wizard, choose to save the export as an SSIS package, creating a .dtsx file. The package can then be run from SQL Agent on the server itself, which would likely circumvent the SSMS restrictions discussed in the reference.
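If you go that route, a saved package can also be run directly from the command line with the dtexec utility (the path below is a placeholder for wherever you save the .dtsx):

```shell
dtexec /F "C:\Exports\ExportToJson.dtsx"
```

Running it this way, outside SSMS, is a quick check of whether the corruption comes from the package itself or from the tooling around it.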

If you have, say, a Visual Studio Professional subscription, then two Microsoft technical support incidents are included. However, the problem as described doesn't sound easy to reproduce, which is likely to be problematic for both forum respondents and Microsoft tech support. Can you pinpoint a problem with specific entities in your data?

    All the best.

Thank you Brendon. I have saved them as packages and run them directly from SSIS, and still the same issue.

What I will do next is export the package from SSIS, open it in Visual Studio, and run it from there.

If that fails, I will write the SSIS package from scratch in Visual Studio, without following the wizard.

Lastly, I will check whether any of the nested queries (which provide the various properties for the JSON) are unoptimized, and will try either converting everything to transactions or going the other way round and using optimistic concurrency in SQL.

I am also wondering whether it is just a bug triggered by the high transfer rates of my kit (over 10 GB/s sequential read rate).

The issue seems to originate from the default datatype of the Unicode option in the SSIS package wizard.

For details, please look here.
