Should I use XML?

  • Hi all,

    I'm about to get involved in designing a database for an internet solution for GPS tracking of vehicles.

    What I'm wondering is: should I use XML?  In the past I used COM+ with ADO but found that disappointing.

    To give you an idea of the amount of data to be inserted and searched upon

    4 positions an hour x 10 hours = 40 per vehicle, x 1,000 vehicles = 40,000 inserts during 8AM to 8PM

    At night historical data is downloaded from the units

    1,000 vehicles x 60 logs per hour x 10 hours = 600,000 logs to be entered between 8PM and 8AM, without affecting the speed of the database and web solution.
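    A quick back-of-the-envelope check of the figures above, using only the rates stated in the post:

    ```python
    # Rough throughput estimate for the volumes quoted above.

    vehicles = 1000

    # Daytime: live positions, 8AM-8PM
    positions_per_hour = 4
    day_hours = 10
    day_inserts = positions_per_hour * day_hours * vehicles
    print(day_inserts)  # 40000

    # Night: historical logs downloaded from the units, 8PM-8AM
    logs_per_hour = 60
    night_hours = 10
    night_inserts = logs_per_hour * night_hours * vehicles
    print(night_inserts)  # 600000

    # Average insert rate if the 600,000 logs are spread over the 12-hour window
    print(round(night_inserts / (12 * 3600), 1))  # ~13.9 inserts/second
    ```

    Note that 600,000 rows over a 12-hour window works out to roughly 14 inserts per second on average, which matches the rate mentioned later in the thread.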

    Do you see my problem?  So I'm wondering if mass insertion using XML would be the best way.



    Kindest Regards,

    Jim

  • Jim,

    As far as I know, the GPS devices send their data in a raw format, which you would have to translate to XML first and then import into SQL Server.

    I wouldn't use XML for that kind of data loading. BCP or DTS does a fairly good job (and is also the fastest). An even faster way to load data from files is to write your own specialised BCP app using the ODBC BCP API.

    Based on my experience, a dedicated BCP-API-based app is 2-3 times faster than DTS or BCP "off the shelf".

    XML is fine when you want to exchange information between automated apps, but it is not the fastest way to communicate, especially when none of the apps speak XML natively.



    Bye
    Gabor

    BCP?  I don't know that term.

    So I take it that you would use XML for interfacing with a web site?



    Kindest Regards,

    Jim

  • BCP = Bulk Copy

    This is an external exe file that comes with SQL Server. You can find it in C:\Program Files\Microsoft SQL Server\80\Tools\Binn in a standard installation. More or less the same can be accomplished with DTS (Data Transformation Services), which also comes with SQL Server.

    XML is great for data exchange between applications (not necessarily web-based) where complex data is exchanged, or where you don't want to write a specific interface program for each and every application.

    But all this, of course, comes at a price in performance.



    Bye
    Gabor

    I need high performance - initially 14 entries per second.

    A year on, 70 entries per second.

    I heard I could look at writing a trans/comment state that would do it, but I haven't had a chance to look into it yet.  Of course the data will be coming from either VB or VB.NET (haven't decided which yet).



    Kindest Regards,

    Jim

    If you give me the format of the file you want to insert into the database (and maybe the DDL of the table you want to insert into), I will tell you how to do it. You can use my email as well if you wish to send the file with the description.

    Normally your GPS data comes in a flat file in a common NMEA sentence format, which you can translate. This translation can be done either at the interface-program level or within the database itself.

    If you do it at the interface-program level you will have to do some programming, but the performance will be the best. If you do it at the database level you only have to program in SQL, so it will be a bit slower, and you will need a worktable where you will put the data.
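    The interface-level translation described above can be sketched in Python. The field layout below is a hypothetical GPRMC-like example for illustration, not the actual device format:

    ```python
    # Sketch of interface-level translation: split an NMEA-style CSV
    # sentence into named fields before loading. The field layout here
    # is a hypothetical GPRMC-like example, not the real device format.

    def parse_sentence(line: str) -> dict:
        fields = line.strip().split(",")
        return {
            "sentence": fields[0],
            "time_utc": fields[1],
            "status": fields[2],
            "latitude": fields[3],
            "lat_hemisphere": fields[4],
            "longitude": fields[5],
            "lon_hemisphere": fields[6],
        }

    row = parse_sentence("$GPRMC,123519,A,4807.038,N,01131.000,E")
    print(row["latitude"])  # 4807.038
    ```

    Each parsed dict would then map onto one row of the worktable, ready for a bulk load.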



    Bye
    Gabor

    Well, I see you know GPS.  Basically the information is more or less a CSV string very similar to raw NMEA.

    I have a "work" table that I insert this data into, along with something to identify the user, and then I use triggers to cut up the data and feed it into other tables.

    There are a lot of other tables, depending on the reason the data was sent - alarm, start of a journey, end of a journey, etc.
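    The dispatch step can be pictured as routing each row to a target table by its reason code. The codes and table names below are made up for illustration:

    ```python
    # Sketch of the dispatch idea: route each parsed row to a target
    # table based on its reason code. Codes and table names here are
    # hypothetical, not the actual schema.

    ROUTES = {
        "ALM": "Alarms",
        "JST": "JourneyStarts",
        "JND": "JourneyEnds",
    }

    def target_table(reason_code: str) -> str:
        # Fall back to a generic positions table for plain position fixes
        return ROUTES.get(reason_code, "Positions")

    print(target_table("ALM"))  # Alarms
    print(target_table("POS"))  # Positions
    ```

    In the thread's setup this routing lives in triggers (or, as suggested below, a stored procedure) rather than in application code.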

    That's it in short.  Any suggestions etc will be much appreciated.



    Kindest Regards,

    Jim

    Use this to insert the data into your worktable:

    bcp MyDB.MyUser.MyWorkTable in MyInputFile.csv -c -C1250 -SMyServer -T -t, -r\n

    The -t switch assumes that you have a comma (,) as the field separator. Of course you have to change all the My... placeholders to your real settings, as well as the codepage switch (-C), which in my case is 1250.

    In your place I would first empty the worktable (truncate table) and disable (or drop) every index and trigger on it. Then load the data and run a stored proc (instead of a trigger) to dispatch the "real" data into your real tables. At the end you could empty the table again (just to avoid loading the same data twice).
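    The truncate/load/dispatch cycle can be sketched as a small driver script. The MyDB/MyUser/MyWorkTable names are the placeholders from the bcp example, and MyDispatchProc is a hypothetical stored procedure name:

    ```python
    # Sketch of the truncate -> load -> dispatch -> truncate cycle.
    # Server, database, and table names are the placeholder "My..."
    # names from the bcp example; substitute your real settings.

    def bcp_command(db, user, table, infile, server, codepage="1250"):
        # Builds: bcp <db>.<user>.<table> in <file> -c -C<cp> -S<srv> -T -t, -r\n
        return ["bcp", f"{db}.{user}.{table}", "in", infile,
                "-c", f"-C{codepage}", f"-S{server}", "-T", "-t,", "-r\\n"]

    # SQL statements the job would issue around the bulk load
    pre_load = "TRUNCATE TABLE MyWorkTable"    # empty the worktable first
    dispatch = "EXEC MyDispatchProc"           # hypothetical proc feeding the real tables
    post_load = "TRUNCATE TABLE MyWorkTable"   # avoid loading the same data twice

    cmd = bcp_command("MyDB", "MyUser", "MyWorkTable",
                      "MyInputFile.csv", "MyServer")
    print(" ".join(cmd))
    ```

    In practice each of these steps would become a step of the Agent job described below, so no external program is needed at all.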

    I would create a SQL Server Agent job, put everything inside, set the frequency, and that's all. Everything should run without any manual intervention. No external program needed; fast and reliable...

    You just have to ensure that the data file is put in a place where your SQL Server can "see" it.



    Bye
    Gabor

    Interesting way of doing it. Thanks.


    Kindest Regards,

    Jim
