DTS Import Issue

  • I have a DTS package that imports several text files into different tables in my database.  These text files contain new or changed records and have no header row.  The problem I am having is that on some days a few of these files contain no new or changed records and arrive as 0 KB.  The package runs through, but it reports a failure because it can't import those empty files.

    Is there any way to put logic around this import?

  • Using ActiveX scripts, split the retrieval and the size check into separate steps.  In the second step, use the FileSystemObject to check the size of the file.  If it's zero, delete the file and have the prior step reprocess.

    I've done this before, but unfortunately don't have immediate access to my code at the moment.  If that doesn't give you enough to go on, let me know, and I'll provide the code on Monday.
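    In the meantime, here's a minimal sketch of that check-and-delete step as an ActiveX task, assuming the file path arrives in an ImportFile global variable; exactly how the retrieval step gets rerun depends on your package's workflow:

    Function Main()
        Dim fso, sFile
        Set fso = CreateObject("Scripting.FileSystemObject")
        sFile = DTSGlobalVariables("ImportFile").Value
        If fso.FileExists(sFile) Then
            If fso.GetFile(sFile).Size = 0 Then
                ' Empty file: remove it and fail the step so the package
                ' workflow can rerun the retrieval step
                fso.DeleteFile sFile
                Main = DTSTaskExecResult_Failure
            Else
                Main = DTSTaskExecResult_Success
            End If
        Else
            Main = DTSTaskExecResult_Failure
        End If
        Set fso = Nothing
    End Function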

  • I have a system that imports a bunch of CSV files created from Oracle SQL*Plus scripts. I structured things so that the DTS package is the last/bottom procedure in the chain. The package gets its parameters on the command line, and they are picked up through a Dynamic Properties task.

    I found that this gives me flexibility in rehosting and preprocessing the files. It turns out that SQL Server scheduled jobs invoke DTS through the command line anyway, so it isn't too different from the point-and-click setup.

    The DOS command file ensures that the package is not called if the file is zero size; see the command below. The values within paired % signs are DOS environment variables, while those like %%T are FOR loop variables. Note the "%%~zI"; it expands to the file size:

    FOR %%T IN (X, Y, Z) DO (
        FOR %%I IN (%DFOLDER%\Data_%%T.*) DO (
            IF %%~zI NEQ 0 DTSRun /S %SERVER% /E /N dataIngest_%%T /A File:8=%%~fI /A Server:8=%SERVER% /A Catalog:8=%CATALOG% /A Exceptions:8=%EFOLDER%\%%~nI.ERR
        )
    )
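    For reference, the DTSRun switches used here: /S names the server, /E requests a trusted (Windows) connection, /N names the package to run, and each /A assigns a DTS global variable, with the :8 suffix marking the value as a string.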

  • I'm facing exactly the same issue.  I have a DTS package that collects the new and changed records posted from our campus administrative system to a set of text files each night.  There are 47 tables being loaded in this package, and I know that on any given import one or more of the files may be empty.  The package runs fine, but it shows up as a failed job because of the failed steps.  The package is designed to step through the imports 'On Completion' and doesn't care whether individual steps fail, since that is dealt with in the T-SQL on the merge step, but apparently that doesn't impress SQL Agent.

    I have resorted to dropping and recreating a dummy table as the final step in my package, as a flag I can check to see that the package has run, and I ignore the job failure notice.  Clearly that isn't the most professional fix.  If there is a way to restructure the logic of my package so that the job won't show as failing, I'd love to learn more.  Code examples would certainly be appreciated if you have them.

  • Ah, vb...

    '**********************************************************************
    '  Visual Basic ActiveX Script
    '**********************************************************************
    Function Main()
        Dim fso, f

        Set fso = CreateObject("Scripting.FileSystemObject")

        ' The import file path is passed in through a DTS global variable
        If fso.FileExists(DTSGlobalVariables("ImportFile").Value) Then
            Set f = fso.GetFile(DTSGlobalVariables("ImportFile").Value)
            MsgBox "File Size: " & f.Size / 1000 & " KB"
            Set f = Nothing
        End If

        Set fso = Nothing
        Main = DTSTaskExecResult_Success
    End Function
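    The MsgBox is handy for interactive testing, but it will hang a package run unattended from SQL Agent. A variant for scheduled runs, assuming a FileIsEmpty global variable (an assumed name, not from this thread) that downstream steps can test:

    Function Main()
        Dim fso, f
        Set fso = CreateObject("Scripting.FileSystemObject")
        ' Record whether the file is empty instead of popping a dialog;
        ' FileIsEmpty is an assumed global variable defined in the package
        If fso.FileExists(DTSGlobalVariables("ImportFile").Value) Then
            Set f = fso.GetFile(DTSGlobalVariables("ImportFile").Value)
            DTSGlobalVariables("FileIsEmpty").Value = (f.Size = 0)
            Set f = Nothing
        Else
            DTSGlobalVariables("FileIsEmpty").Value = True
        End If
        Set fso = Nothing
        Main = DTSTaskExecResult_Success
    End Function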

     

     

    Signature is NULL

  • You can dynamically choose to skip the import step when it is not needed:

    Use Calvin's script to evaluate the file size.  Based on that, you can choose to either enable or disable the import step.  If you need it disabled (oStep.DisableStep = True), set the precedence basis of the downstream step to status with a value of inactive; you can then skip over the step and still return success.  If you need the import step enabled (oStep.DisableStep = False), set the precedence basis of the downstream step to result with a value of success; that will enforce your dependency on the success of the enabled import step.  A sketch of the size check and disabling follows.
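    A minimal sketch of that script, reusing the ImportFile global variable from Calvin's post and assuming the import step's internal name is DTSStep_DTSDataPumpTask_1 (look up the real name in your package, e.g. via Disconnected Edit):

    Function Main()
        Dim fso, oPkg, oStep
        Set fso = CreateObject("Scripting.FileSystemObject")
        ' DTSGlobalVariables.Parent is the running Package object
        Set oPkg = DTSGlobalVariables.Parent
        ' The step name is an assumption; check the actual name in your package
        Set oStep = oPkg.Steps("DTSStep_DTSDataPumpTask_1")
        If fso.FileExists(DTSGlobalVariables("ImportFile").Value) Then
            ' Disable the import when the file arrived empty
            oStep.DisableStep = _
                (fso.GetFile(DTSGlobalVariables("ImportFile").Value).Size = 0)
        Else
            oStep.DisableStep = True
        End If
        Set fso = Nothing
        Main = DTSTaskExecResult_Success
    End Function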

    ZenDada

  • Hi,

    I have the same problem. I need to loop through a folder to retrieve each file's data. For example, I have a folder called Summary. Inside this Summary folder, I have files from each branch with a standard file name format.

    E.g. S0804B01, S0804B02

    The S indicates it is a summary file, the 0804 indicates August 2004 data, and B01 and B02 indicate the branch code.

    I need to loop through the whole folder (there may be 90 files) and check each file's size. If a file's size is 0 KB, I need to log the file name to a table; otherwise I will import the file into a temporary table.

    Is there any script I can refer to?

    Thanks and regards,

    Jasmine

  • '**********************************************************************
    '  Visual Basic ActiveX Script
    '**********************************************************************
    Function Main()
        Dim obj, filDir, fil

        Set obj = CreateObject("Scripting.FileSystemObject")
        Set filDir = obj.GetFolder("G:\ld_data")

        ' Loop through every file in the folder and show its name
        For Each fil In filDir.Files
            MsgBox fil.Name
        Next

        Main = DTSTaskExecResult_Success
    End Function
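    To cover Jasmine's case, a sketch that also checks each file's size and logs zero-byte file names through ADO; the connection string and the dbo.EmptyFileLog table are assumptions, so adjust them for your environment:

    Function Main()
        Dim fso, folder, fil, conn
        Set fso = CreateObject("Scripting.FileSystemObject")
        Set folder = fso.GetFolder("G:\ld_data")

        ' ADO connection for logging; server and database are assumptions
        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=(local);" & _
                  "Initial Catalog=Staging;Integrated Security=SSPI;"

        For Each fil In folder.Files
            If fil.Size = 0 Then
                ' Hypothetical log table: EmptyFileLog(FileName, LoggedAt)
                conn.Execute "INSERT INTO dbo.EmptyFileLog (FileName, LoggedAt) " & _
                             "VALUES ('" & Replace(fil.Name, "'", "''") & "', GETDATE())"
            End If
        Next

        conn.Close
        Set conn = Nothing
        Set fso = Nothing
        Main = DTSTaskExecResult_Success
    End Function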

     

     

    Hope the above script logic helps you.

    Regards,

    Agson Chellakudam
