Forcing ExecutionResult (the value, not the flow)

  • Oh - soooooooo close.

    TheBeast asked a question about forcing ExecutionResult, but that situation turned out to be more about controlling execution flow. I want control of the actual ExecutionResult value.

    I have a package that calls an ExecuteSQLTask. The stored procedure it executes encounters the infamous "Duplicate key was ignored". For those not lucky enough to have encountered it, this occurs when a unique index is defined WITH IGNORE_DUP_KEY and you attempt to insert a duplicate value. The statement succeeds despite an error message being returned, execution within the stored procedure continues, and all seems well.
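
    For anyone who has not run into it, here is a minimal T-SQL sketch of the behaviour (the table and index names are made up purely for illustration, SQL Server 2000 syntax):

    CREATE TABLE dbo.DupDemo (KeyCol int NOT NULL)

    -- Unique index created WITH IGNORE_DUP_KEY.
    CREATE UNIQUE INDEX IX_DupDemo_KeyCol ON dbo.DupDemo (KeyCol) WITH IGNORE_DUP_KEY

    INSERT dbo.DupDemo (KeyCol) VALUES (1)   -- inserts normally
    INSERT dbo.DupDemo (KeyCol) VALUES (1)   -- "Duplicate key was ignored." - nothing is inserted,
                                             -- but the statement itself completes successfully
    SELECT KeyCol FROM dbo.DupDemo           -- one row; the duplicate was silently dropped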

    The ExecuteSQLTask, however, sees this as failure. Fine - I can figure out if the failure was real or "fake" and branch accordingly using scripting tricks enumerated elsewhere on this and other sites (thanks, folks).

    BUT - the overall package sees a step as having failed and my job executing this package shows failure.

    How can I set (override) the ExecutionResult of the step that has completed? I want to do something like

    Set oStep = DTSGlobalVariables.Parent.Steps("MyFailingStep")
    oStep.ExecutionResult = DTSStepExecResult_Success

    Am I missing something simple?

    Thanks in advance.

    Larry

  • Unfortunately, the ExecutionResult step property is read-only.

    To bypass that behaviour, you could handle the errors through VB, or use an ActiveX Script task to run your sp and handle the errors in the script.

    ... VERY annoying


    * Noel

  • Why would you have a "unique" index that allows duplicates?

    Could you call your stored proc from another proc (handling the 'error' in the new proc), then execute the new proc from DTS?
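
    Something along these lines, purely as a rough sketch - the proc and object names here are only placeholders:

    CREATE PROCEDURE dbo.usp_InsertWrapper
    AS
        -- Run the existing proc (usp_InnerInsert is a placeholder name).
        EXEC dbo.usp_InnerInsert

        -- Report success to the caller unless a genuine error was recorded;
        -- the "Duplicate key was ignored" message does not fail the INSERT itself.
        IF @@ERROR <> 0
            RETURN 1
        RETURN 0
    GO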

  • "why a unique index"

    IGNORE_DUP_KEY does not allow duplicates to be inserted: it inserts the unique rows and ignores (does not insert) the non-unique ones. It is an alternative to the INSERT ... WHERE NOT EXISTS pattern. I would not have chosen this method had I done the design, though I will say the designers had some valid reasons for considering this direction.
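
    For comparison, the INSERT ... WHERE NOT EXISTS approach would look roughly like this (table and column names are invented for the example):

    INSERT dbo.TargetTable (KeyCol)
    SELECT s.KeyCol
    FROM dbo.SourceTable s
    WHERE NOT EXISTS (SELECT * FROM dbo.TargetTable t WHERE t.KeyCol = s.KeyCol)
    -- Only keys not already present are inserted, so no duplicate insert is ever
    -- attempted and no "Duplicate key was ignored" message is raised.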

    "wrapper proc"

    Unfortunately, DTS still sees the error from the "inner" proc as an error, even though the message from IGNORE_DUP_KEY is not really an error - it was doing exactly what it was designed to do. This is a known problem with IGNORE_DUP_KEY and is discussed in some knowledge base articles.

    Thanks,

    Larry


  • The BOL states "SQL Server issues a warning and ignores the duplicate row."

    Unfortunately, SQL Server actually issues an error with severity level 16.

  • Could you use a Data Driven Query task instead of the ExecuteSQL task? Then you could trap the insert failures and 'skiprow'.
