Who Tests the Code Testers?

  • It's an interesting point, actually: with all the automated code testing tools out there, who is testing those tools? That was the question posed in a recent column. Who verifies that the testing software works well, or that it tests what you need tested? No public benchmarks are available to help you figure out which tool is best.

    Oracle recently talked about its own struggles, and they're mentioned in the article as well. With 50 million lines of code, just finding a tool to test the code is hard. Getting the testing done in a reasonable amount of time is something else entirely. And ensuring the testing is actually done correctly? I can't fathom how that takes place.

    And it requires expertise. Oracle mentions that a few of its most talented developers have to be pulled off their regular work to read through the test results and interpret them. While Oracle probably doesn't like that, it's really the best way to ensure the testing is done well: use the best developers, the ones who truly understand the code, to look through the test results and confirm the testing is really working.

    It's quite a problem for large software vendors. It seems they may need to be in the tool-testing business as well as the business of building their own software.

    Steve Jones

  • Not entirely relevant, but Joel Spolsky (http://www.joelonsoftware.com), who as you know is pretty whizzy, devotes a lot of time and thought to software testing. His latest blog post mentions having one of their brightest developers (and he MUST be bright!) develop a testing tool for their new Ajax product. It's an interesting change of focus for software developers, who traditionally have not been particularly interested in system testing (to their products' detriment, I believe).

  • Code coverage utilities have been added to the repertoire of good developers who implement unit tests. Code coverage tools identify which areas of the code are not tested adequately. This is pretty low-level stuff and doesn't necessarily verify the requirements, but there are tools for that, too. Testing has come a long way in the past few years.
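    As a minimal sketch of the idea: a unit test suite exercises some branches of a function, and a coverage tool reports the branches it missed. The `classify_discount` function below is hypothetical, invented purely for illustration; the `coverage` command shown in the comment refers to the coverage.py tool.

```python
import unittest

def classify_discount(total):
    # Hypothetical business rule, used only to illustrate branch coverage.
    if total >= 1000:
        return "gold"
    elif total >= 100:
        return "silver"
    return "none"

class DiscountTests(unittest.TestCase):
    # These two tests cover the "silver" and "none" branches only.
    def test_silver(self):
        self.assertEqual(classify_discount(250), "silver")

    def test_none(self):
        self.assertEqual(classify_discount(10), "none")

# Running the suite under coverage.py, e.g.
#   coverage run -m unittest && coverage report -m
# would flag the "gold" branch as never executed, prompting a third test.
```

    The coverage report doesn't say the tests are *good*, only which lines they reached; requirements-level tools pick up from there.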

    Bryant E. Byrd, BSSE MCDBA MCAD
    Business Intelligence Administrator
    MSBI Administration Blog

  • Since I started my career as a hardware test tech and moved into test engineering, I've always been hyper-aware of writing software that can be tested; in fact it shapes the way I write code to this day. When I was a hardware test engineer I would force my way into the hardware design meetings and insist that the hardware be designed so that it could be tested. Usually it was simple things, like layout and circuits that could be isolated via a simple mechanism. This sometimes delayed the product by a slight amount, but we got it tested and made great strides in reliability once we could go in and test the product thoroughly.

    Software needs to be designed the same way, and unit testing is a very important aspect of that. To achieve the coverage required by very complex software, there need to be mechanisms to address the units as well as the whole. This does impose limitations on the software engineers, but in return you will see gains in reliability, and that is what your reputation is based on. If you have a whiz-bang product that crashes frequently (unless you are Microsoft), you will lose to the slightly less whiz-bang one that's reliable.
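    The "isolatable circuit" idea translates to software roughly as dependency injection: write each unit so its collaborators can be swapped for a stub on the test bench. This is a sketch under invented names (`total_due`, `flat_rate_lookup`, `Invoice` are all hypothetical), not any particular framework's API.

```python
from dataclasses import dataclass

def flat_rate_lookup(hours):
    # Hypothetical production pricing rule: a flat hourly rate.
    return hours * 50.0

@dataclass
class Invoice:
    hours: float

def total_due(invoice, rate_lookup=flat_rate_lookup):
    # The rate lookup is injected, so a test can isolate this unit
    # with a stub, much like probing an isolated sub-circuit.
    return round(rate_lookup(invoice.hours), 2)

# On the "test bench": substitute a fixed-reading stub for the dependency.
def stub_lookup(hours):
    return 10.0

assert total_due(Invoice(3), rate_lookup=stub_lookup) == 10.0
# And exercise the unit with its real collaborator wired in.
assert total_due(Invoice(2)) == 100.0
```

    The constraint on the engineer is the same as in hardware: the seam has to be designed in up front, but it pays for itself every time the unit can be tested without the rest of the system.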

    My $.02

    Scott

    Kindest Regards,

    Scott Beckstead

    "We cannot defend freedom abroad by abandoning it here at home!"
    Edward R. Murrow

    scottbeckstead.com

  • I agree completely, Scott! I often have to jump through some hoops to get my code organized in a testable form, but the green bar is always worth it.

    Bryant E. Byrd, BSSE MCDBA MCAD
    Business Intelligence Administrator
    MSBI Administration Blog
