
Load Testing

soapUI provides extensive functionality for load-testing your TestCases.

Any number of LoadTests can be created for a TestCase (using the TestCase's "New LoadTest" popup-menu action), each with different strategies, assertions, etc., to validate or assess the performance of a TestCase and its TestSteps under different circumstances.

The documentation for load-testing in soapUI has been split into the following documents:

  • This document gives background on the soapUI approach to load-testing and an overview of the LoadTest Editor
  • Configuration : describes limits and strategies
  • Execution : describes the execution of LoadTests
  • Assertions : specifies the available LoadTest assertions and how they are used
  • Diagrams : describes the available diagrams during LoadTesting
  • JMeter Comparison : a comparison with the popular JMeter tool

Requirements-Driven Load Testing

One of our main objectives with the load-testing functionality in soapUI was to implement a "requirements-driven" approach to load-testing. Far too often (in our experience), load-testing is performed just to see "how fast" a certain web service or business process is. Although this may be interesting from a technical (and sometimes business) point of view, it is more often important that the web service is "fast enough" for the actual business processes being realized, which is usually far below the performance actually achievable. Although this "fast enough" is usually very hard for the business owner to define, we believe it is nonetheless the best approach when assessing the performance of a web service and/or its environment.

Based on this approach, soapUI allows you to define a number of LoadTest Assertions that are continuously applied to an ongoing LoadTest to validate that it performs "as required"; for example, the average response time of a request can be asserted not to exceed a specified value for an extended period of time. Other available assertions include maximum response time, TPS, and more.
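
To make the requirement concrete, the check such an assertion performs is conceptually similar to the sketch below. This is an illustrative Java sketch only; the class and method names are hypothetical and do not correspond to soapUI's internal API, since the real assertions are configured in the LoadTest Assertions tab.

    // Illustrative sketch of a "maximum average response time" requirement check.
    // Hypothetical names only -- soapUI configures this through the LoadTest
    // Assertions tab, not through code.
    import java.util.ArrayDeque;
    import java.util.Deque;

    class AverageResponseTimeCheck {
        private final long maxAverageMs;   // the requirement, e.g. 500 ms
        private final int windowSize;      // number of recent samples to average
        private final Deque<Long> samples = new ArrayDeque<Long>();

        AverageResponseTimeCheck(long maxAverageMs, int windowSize) {
            this.maxAverageMs = maxAverageMs;
            this.windowSize = windowSize;
        }

        // Called after each completed request; returns false when the running
        // average of recent response times exceeds the required maximum.
        boolean onSample(long responseTimeMs) {
            samples.addLast(responseTimeMs);
            if (samples.size() > windowSize) {
                samples.removeFirst();     // keep a sliding window of recent samples
            }
            long sum = 0;
            for (long t : samples) {
                sum += t;
            }
            return (sum / samples.size()) <= maxAverageMs;
        }
    }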

This functionality can further be used for "surveillance testing", where a number of LoadTests are run periodically by a scheduling tool to verify that individual web services (for example in a SOA or an integration API) continuously perform as required. A setup for this is described in the Surveillance Testing document.
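
As a minimal sketch of such a setup, a scheduler like cron could simply trigger the LoadTest runs nightly. The paths and the wrapper script below are hypothetical; how to actually launch the LoadTests from the command line is covered in the Surveillance Testing document.

    # Example crontab entry: run the surveillance LoadTests every night at 02:00.
    # run-loadtests.sh is a hypothetical wrapper script that launches the LoadTests
    # as described in the Surveillance Testing document.
    0 2 * * * /opt/soapui/run-loadtests.sh >> /var/log/soapui-loadtests.log 2>&1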

The LoadTest Editor

The LoadTest Editor gives an overview of the current LoadTest configuration, results and events:

The editor contains the following components (top-down):

  • The LoadTest toolbar containing a number of actions and the Test Limit settings for this LoadTest
  • The Load Strategy toolbar containing settings for the selected Load Strategy
  • The Statistics Table showing up-to-date statistics on the current LoadTest
  • A Tabbed Pane containing two tabs: a "LoadTest Log" tab showing log events for the current LoadTest, and a "LoadTest Assertions" tab where the assertions for this LoadTest can be configured

The LoadTest Toolbar

The main toolbar contains the following actions (left to right):

  • Run : Starts the LoadTest as described under LoadTest Execution
  • Cancel : Cancels an ongoing LoadTest
  • Statistics Graph : Shows the Statistics Graph for the LoadTest
  • Statistics History Graph : Shows the Statistics History Graph for the LoadTest
  • Reset Statistics : Resets the statistics for an ongoing LoadTest
  • Export Statistics : Prompts to export the current LoadTest Statistics to a comma-separated file
  • Options : Shows the LoadTest Options dialog
  • Limit Settings : Sets the limit for the LoadTest as described in the Execution document
  • The far right contains a Progress Bar displaying the progress (in percent) of the current LoadTest execution

The LoadStrategy toolbar is described in the Execution document.

LoadTest Options

The LoadTest Options dialog contains the following settings:

  • Thread Startup Delay : Sets the startup delay for each thread (in milliseconds); setting this to 0 starts all threads simultaneously
  • Reset Statistics when thread-count changes : Automatically resets statistics when the number of threads changes. Since (for example) the average is calculated using the number of threads, its value would otherwise be influenced by results gathered with a different thread-count; resetting the statistics when the thread-count changes avoids this
  • Calculate TPS/BPS based on actual time passed : By default, TPS (transactions per second) is calculated as (1000 / avg) * threadcount, see Calculation of TPS/BPS. When setting a testcase delay with the Simple LoadStrategy, the avg will generally be very low, but the actual number of transactions per second will not be correspondingly high (since there is a delay). Selecting this option instead calculates TPS as the number of completed requests divided by the actual time passed (see the worked example after this list)
  • Include request write in calculated time : When selected, the time to establish the connection to the target endpoint and write the HTTP request is included in the calculated time for Request test steps. Select this option if you want to measure the "actual" request time (including any proxy negotiation, authentication challenges, etc.)
  • Include response read in calculated time : When selected, the time to read the HTTP response body is included in the calculated time for Request test steps. If not selected, only the time to read the response HTTP headers is included (the response content is still read for assertions and for viewing results)
  • Close Connections after each request : Select this to disable Keep-Alives/connection reuse, which results in a load-testing scenario that resembles an environment with many different Web Service clients
  • Sample Interval : Sets the sample interval for the LoadTest Statistics table (in milliseconds); the default is 250 ms
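
As a worked illustration of the difference between the two TPS calculations, consider the sketch below. The numbers are assumed for the example, not taken from a real run.

    // Worked example comparing the two TPS calculations (illustrative numbers only).
    public class TpsExample {
        public static void main(String[] args) {
            int threadCount = 5;
            double avgMs = 20.0;            // measured average request time in milliseconds
            double completedCount = 600;    // requests completed so far
            double secondsPassed = 60.0;    // actual time passed, including the testcase delay

            // Default formula: (1000 / avg) * threadcount
            // -> 250 TPS, which overstates throughput when a testcase delay keeps
            //    the threads idle between requests.
            double tpsDefault = (1000.0 / avgMs) * threadCount;

            // "Calculate TPS/BPS based on actual time passed": count / time passed
            // -> 10 TPS, the throughput actually achieved over the elapsed minute.
            double tpsActual = completedCount / secondsPassed;

            System.out.println("default formula:      " + tpsDefault + " TPS");
            System.out.println("based on time passed: " + tpsActual + " TPS");
        }
    }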

The effect of the three connection-related options above (request write, response read, and closing connections) on LoadTest results can be rather substantial for response times (depending on network configuration, etc.) and is illustrated and discussed in the JMeter Comparison document.
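
As a purely illustrative example with assumed timings (not actual measurements): suppose establishing the connection and writing the request takes 15 ms, waiting for and reading the response headers takes 55 ms, and reading the response body takes a further 30 ms. With both "include" options deselected, the reported time is 55 ms; enabling "Include request write in calculated time" raises it to 70 ms, and additionally enabling "Include response read in calculated time" raises it to 100 ms, although nothing about the service itself has changed.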

The Statistics Table

The statistics table shows real-time statistics for each TestStep in the underlying TestCase and for the TestCase as a whole.

The Statistics currently collected are described in detail in the LoadTest Execution document.

Double-clicking a TestStep row in the table opens that TestStep's associated editor.

Right-clicking a TestStep shows a popup-menu with TestStep-specific actions and actions to add LoadTest Assertions for the selected TestStep, as described under LoadTest Assertions.

The LoadTest Log

The LoadTest Log displays execution status messages and errors reported by the LoadTest Assertions. Double-clicking an error opens the error's result viewer (if available), for example allowing you to see the actual request/response that generated the error.

The top toolbar contains the following actions (left-to-right):

  • Remove Errors : removes all errors from the LoadTest Log
  • Export : prompts to export the current LoadTest log to a file
  • Show Types filter : filters which types of errors/messages should be shown in the log
  • Show Steps filter : filters which steps should be shown in the log

The table can be sorted by clicking the header of the column to sort on. A label under the table displays the number of rows currently in the table.


Next: LoadTest Configuration