Rules of thumb - perceptible changes in performance

  • Does anyone know of any citable reference regarding what percentage change in performance users are typically able to perceive? I've read that, generally speaking, a plus/minus 20% change in performance (transaction throughput, search results, etc.) is not readily perceived by typical application users. What sorts of authorities and resources are there on this topic?

    Thanks,

    Mark Denner

  • Users generally consider something performant the moment they perceive they can do their job without waiting for the machine.

    Intermediate improvements will only make them more polite towards you, not satisfied.

    In the case of long processes, it's best to formalize their use, so that users can tell the difference between exceptional loads (boasting to their neighbours about how large their processing jobs are) and daily tasks in their GUI (complaining to their neighbours about how slow their system is 🙂 ).

  • I agree with SeekQuel.

    In a typical OLTP environment, a 100% degradation of a query that normally runs in 10 ms and now takes 20 ms is not noticeable by the user.

    It really depends on your environment, but as a rule of thumb, when 80% of OLTP-like queries finish in under 1-2 seconds, the user is satisfied with the response time (the sketch at the end of the thread shows one way to check this against measured timings).

    But you as a DBA have to watch those signals, because they mean it is a good time to do some DB maintenance, check the indexes, etc.

    Bye

    Gabor



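The 10 ms vs. 20 ms point and the 80%-under-2-seconds rule of thumb above are easy to sanity-check against real measurements. The following is a minimal sketch, not taken from the thread: the `latency_summary` helper, the sample timings, and the 2-second threshold are all illustrative assumptions, written in Python only because no particular tooling is named in the posts.

```python
import math

def latency_summary(response_times_sec, threshold_sec=2.0, percentile=0.80):
    """Return (fraction of samples at/under threshold, nearest-rank percentile latency)."""
    if not response_times_sec:
        raise ValueError("need at least one measurement")
    ordered = sorted(response_times_sec)
    under = sum(1 for t in ordered if t <= threshold_sec) / len(ordered)
    # Nearest-rank percentile: the smallest sample value with at least
    # `percentile` of the measurements at or below it.
    idx = max(0, math.ceil(percentile * len(ordered)) - 1)
    return under, ordered[idx]

if __name__ == "__main__":
    # Hypothetical response times in seconds, e.g. pulled from an application log.
    samples = [0.010, 0.020, 0.050, 0.150, 0.400, 0.900, 1.200, 1.800, 2.500, 3.100]
    frac_ok, p80 = latency_summary(samples)
    print(f"{frac_ok:.0%} of queries finished within 2 s; 80th percentile = {p80:.3f} s")
    if frac_ok < 0.80:
        print("Below the rule of thumb - time to look at maintenance and indexes.")
```

Feeding in timings collected from your own application log or trace turns this into the kind of signal Gabor describes for deciding when index and maintenance work is due.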
