DTS Question

  • Hi

    I need the results of an Execute SQL task to be output to a text file.

    Please can someone tell me how to achieve this?

  • You cannot do this with an Execute SQL Task in the general case. However, what is it you are actually after? The number of records affected, the data returned, or something else?

    In most cases, if it is data being returned, you use either a Transform Data Task (highlight the two connections, right-click either one and choose Transform Data Task) or a Data Driven Query Task. For more complicated needs you create an ActiveX Script task to do the work.

    Depends on what you are after.

  • At the moment I have two tasks. The first runs two queries, and based on the results of those queries, it then executes another query in the next task, the results of which I would like to pipe out to a text file.

    I am not familiar with the data driven query task - maybe I should check this out.

  • Can you walk me through how you set up each step and how the next query is determined?

    Also, is the next task's query changing in any way?

     

  • Could you not create a text file destination and then use the Transform Data Task to push to the text file? It can do everything an Execute SQL task can do, and it can also push to a file. That seems like the route to go.

  • I would use a combination. Run the Execute SQL task for your pre-queries, then use the data driven task to run the query that outputs to the file.

    The data driven task is well documented elsewhere.  You should have no trouble finding out how to do this.

  • You can accomplish this using the Execute SQL task. First create a Global Variable in the DTS package. Then in the Execute SQL task add your query, go to the Parameters option and select the Output Parameters tab.

    Select the Rowset option, pick the Global Variable you created from the drop-down, and click OK.

    Now create an ActiveX Script task and add an On Success workflow from the Execute SQL task to the ActiveX Script task.

    Code your ActiveX Script something like this:

    Function Main()
        Dim oRS
        Dim oFS
        Dim oOutFile
        Dim sOutString

        Set oFS = CreateObject("Scripting.FileSystemObject")
        Set oOutFile = oFS.CreateTextFile("E:\MyTestFile.txt", True)

        '-- THIS IS THE GLOBAL VARIABLE: the Rowset output parameter already
        '-- holds a disconnected ADO recordset, so assign it directly.
        Set oRS = DTSGlobalVariables("g_Results").Value

        If Not oRS.EOF Then
            Do Until oRS.EOF
                '-- The Fields collection is zero-based, so this writes the first five columns
                sOutString = ""
                sOutString = sOutString & oRS.Fields(0).Value
                sOutString = sOutString & oRS.Fields(1).Value
                sOutString = sOutString & oRS.Fields(2).Value
                sOutString = sOutString & oRS.Fields(3).Value
                sOutString = sOutString & oRS.Fields(4).Value
                oOutFile.WriteLine sOutString
                oRS.MoveNext
            Loop
        End If

        Set oRS = Nothing
        oOutFile.Close
        Set oOutFile = Nothing
        Set oFS = Nothing

        Main = DTSTaskExecResult_Success
    End Function

  • While it's true you can do it this way, for your purposes the data driven task is a better choice.

    The main purpose of creating a recordset global variable with the Execute SQL task is so that it can be manipulated in later steps. You do not have that requirement. The data pump will send the data directly to the file, so you will not incur the overhead of looping through a recordset and writing to the text file line by line.

  • True, the data pump will be more efficient. The only advantages to my solution are that you can ensure the file always exists where you expect it, and you can do some archiving using dates in the filename; see the sketch after this post.

    Just providing alternatives.
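
    A minimal sketch of what I mean, keeping the same ActiveX approach as above (the output folder and file prefix here are just placeholders, not anything your package requires):

    Function Main()
        Dim oFS
        Dim oOutFile
        Dim sFileName

        '-- Build a date-stamped filename, e.g. E:\Output\CustomerExtract_20041231.txt
        '-- (folder and prefix are assumptions; use whatever fits your package)
        sFileName = "E:\Output\CustomerExtract_" & _
                    Year(Now()) & Right("0" & Month(Now()), 2) & Right("0" & Day(Now()), 2) & ".txt"

        Set oFS = CreateObject("Scripting.FileSystemObject")
        Set oOutFile = oFS.CreateTextFile(sFileName, True)

        '-- ...write the recordset out exactly as in the earlier script...

        oOutFile.Close
        Set oOutFile = Nothing
        Set oFS = Nothing

        Main = DTSTaskExecResult_Success
    End Function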

  • You can use the Dynamic Properties task to make the output file name anything you want. I use it to take in files when the names constantly change (and sometimes unpredictably so), and to create output files that are usually date-time stamped; a sketch of that follows below.

    Alternatives are always good, but the DTS data pump can do the things listed in the previous post.
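
    A minimal sketch of that pattern, assuming a small ActiveX Script step runs ahead of the Dynamic Properties task: the script builds the date-stamped path into a global variable (g_OutFileName is just an example name), and the Dynamic Properties task is then configured to assign that global variable to the text file connection's DataSource property.

    Function Main()
        '-- Stash a date-stamped output path in a global variable so the
        '-- Dynamic Properties task can map it onto the text file connection.
        DTSGlobalVariables("g_OutFileName").Value = _
            "E:\Output\CustomerExtract_" & _
            Year(Now()) & Right("0" & Month(Now()), 2) & Right("0" & Day(Now()), 2) & ".txt"

        Main = DTSTaskExecResult_Success
    End Function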

  • Sorry for delay in replying.

    I tried to use the transform data step, and although it parsed OK, when I ran it I got an invalid pointer error.

    I rolled the SQL up into one statement, as below:

    SET NOCOUNT ON

    DECLARE @iCountOfCustomerDateRecords INT
    DECLARE @iCountOfCustomerDateRecordsProcessed INT

    SELECT @iCountOfCustomerDateRecords = COUNT(CustomerRunGroup)
    FROM myDB.dbo.CustomerRunControl AS a
    INNER JOIN
        (SELECT MAX(CustomerRunGroup) AS CustomerRunGroupMax
         FROM myDB.dbo.CustomerRunGroup) AS b
        ON a.CustomerRunGroup = b.CustomerRunGroupMax

    /* Select count of records from CustomerRunControl for the current month
       where the CustomerProcessingStatus is equal to 1000 (processing completed) */
    SELECT @iCountOfCustomerDateRecordsProcessed = COUNT(CustomerRunGroup)
    FROM myDB.dbo.CustomerRunControl AS a
    INNER JOIN
        (SELECT MAX(CustomerRunGroup) AS CustomerRunGroupMax
         FROM myDB.dbo.CustomerRunGroup) AS b
        ON a.CustomerRunGroup = b.CustomerRunGroupMax
    WHERE a.CustomerProcessingStatus = 1000

    /* If @iCountOfCustomerDateRecords is equal to @iCountOfCustomerDateRecordsProcessed,
       then Customer Result data is available for the current month. */
    IF (@iCountOfCustomerDateRecords = @iCountOfCustomerDateRecordsProcessed)
       AND (@iCountOfCustomerDateRecords > 0)
    BEGIN
        SELECT PartyId,
               AgreementRefNo,
               ApplicationSeqNo,
               CustomerAccountRef
        FROM myDB.dbo.CustomerAccount
    END

    I'm not sure how to use the data driven query task / data pump, but I will try that next, unless anyone can tell me why my Transform Data Task won't run!
