32bit vs 64bit

  • I ran into a problem this week while migrating a large project to execute on our server. The packages were developed on a 32-bit workstation and run fine in that environment. The only catch is that the packages run in parallel, which takes a very large amount of memory, so to remedy this we decided to execute them from the dedicated server. I moved the packages over, tried to run them from the server, and got some rather non-descriptive errors. It finally dawned on me that this was because the server is a 64-bit machine. I was able to get around it by scheduling the packages as jobs and executing them with the 32-bit dtexec command (a rough sketch of that workaround is below).

    So my question is: is there a significant performance advantage to running these packages in 64-bit mode? If the difference is significant, we would need to rework them for 64-bit mode, and I'm wondering whether the payoff is worth the effort. Does anyone have metrics you can share from packages you've migrated to take advantage of a 64-bit machine?

    Thanks In Advance!
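
    Roughly, that workaround looks like the following, written here as a small Python wrapper around the 32-bit dtexec; the install path, the version folder, and the package path are assumptions, so adjust them for your server:

        import subprocess

        # On a 64-bit server the 32-bit dtexec lives under "Program Files (x86)".
        # The version folder ("100") and the package path are assumptions.
        DTEXEC_32 = r"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe"
        PACKAGE = r"D:\ETL\LoadStaging.dtsx"

        # /F runs a package stored in the file system; dtexec signals failure
        # through its exit code, so check it.
        result = subprocess.run([DTEXEC_32, "/F", PACKAGE])
        if result.returncode != 0:
            raise RuntimeError(f"Package failed with exit code {result.returncode}")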

  • 1. Could you please post the type of packages you are running? Is this serving a data warehouse? If yes, how many tables does it populate?

    2. Could you briefly describe the process as well?

    3. Also, what are the memory consumption patterns on the 32-bit boxes, and how much RAM do they have?

    If these packages were newly developed and are being deployed to a production machine for the first time, then you can skip the third question.

    Cheers 🙂

    Cheer Satish 🙂

  • 1. Could you please post the type of packages you are running?

    I'm not exactly sure what you mean by "type of packages." Most of the constructs in SSIS are being used in one package or another.

    Is this serving a data warehouse?

    yes

    If yes, how many tables does it populate?

    Including the staging, aggregate, and DW tables, I would say somewhere in the neighborhood of 100 tables. The end result is probably 30 tables. The whole ETL process is comprised of about 30 different DTSX packages.

    2. Could you briefly describe the process as well?

    Basically, we're pulling from a source system and populating a group of staging tables; this is pretty much a direct copy. Then we read the staging tables, do some derivations and transformations, and load the results into the EDW (a rough sketch of that pattern is at the end of this post).

    This is a newly deployed ETL process.
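
    As a rough illustration of that staging-then-load pattern, here is a minimal sketch; the connection string and table names are made up for the example, not the actual warehouse objects:

        import pyodbc

        # Illustrative connection; server, database, and table names are hypothetical.
        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=etl-server;DATABASE=EDW;Trusted_Connection=yes"
        )
        cur = conn.cursor()

        # 1. Direct copy from the source system into a staging table.
        cur.execute("INSERT INTO stg.Orders SELECT * FROM SourceDb.dbo.Orders")

        # 2. Derivations/transformations on the staged rows, loaded into the EDW.
        cur.execute("""
            INSERT INTO dw.FactOrders (OrderKey, OrderDate, NetAmount)
            SELECT OrderID, CAST(OrderDate AS date), Amount - Discount
            FROM stg.Orders
        """)

        conn.commit()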

  • - The real difference between 32-bit and 64-bit is memory addressing: a 32-bit process is limited to 4 GB of virtual address space (2^32 bytes) and has to lean on paging and related workarounds to use more, whereas a 64-bit process can address far more memory directly, which is where the performance benefit comes from.

    - But to get the most benefit, the applications themselves should be 64-bit compliant, which is why products ship 64-bit releases nowadays, so moving the 32-bit code to be 64-bit compliant is worth the effort.

    - ETL packages are processor-intensive tasks and, as you are aware, need all the firepower you can provide. Also look up the latest TPC ratings and the case studies on this topic on the Microsoft website for some guidelines.

    There should be some baseline benchmarking in a UAT environment before you can justify the value you would get from the hardware environment you are planning to move to (a timing sketch is at the end of this post).

    Sat 🙂

    Cheer Satish 🙂
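
    One way to get that baseline is to time the same package under both the 32-bit and 64-bit dtexec. A minimal sketch follows; both install paths and the package path are assumptions, and a meaningful comparison needs production-sized data volumes:

        import subprocess
        import time

        # Point these at the 32-bit and 64-bit dtexec on your server; the
        # version folder ("100") varies by SQL Server release.
        DTEXECS = {
            "32-bit": r"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe",
            "64-bit": r"C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe",
        }
        PACKAGE = r"D:\ETL\LoadStaging.dtsx"

        # Run the same package with each build and report the wall-clock time.
        for label, exe in DTEXECS.items():
            start = time.monotonic()
            result = subprocess.run([exe, "/F", PACKAGE])
            elapsed = time.monotonic() - start
            print(f"{label}: exit code {result.returncode}, {elapsed:.1f}s elapsed")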
