Performance Test Parameters

The Parameters tab in the performance test results contains the parameters set for the test. In this article, you will find explanations for each of these parameters and their effects on the test results.

Test Duration

The number of minutes Botium generates the given load.

Test Duration determines when the last test step starts, not when the test ends.

Maximum Number of Parallel Users

You can simulate increasing load over time by adjusting the number of 'Parallel Users' that the test should end with. This effectively increases the number of parallel users after each test step until the desired number is reached.

The load multiplier is calculated as follows:

ConvoCountPerTestProject * (Initial Users + (i - 1) * Increase Initial Users)

Where:

  • ConvoCountPerTestProject is the number of conversations in the Test Project,
  • Initial Users is the base number of times to repeat the Test Set,
  • i is the current test step,
  • Increase Initial Users is the increment to apply for each subsequent step.
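To make the calculation concrete, here is a minimal Python sketch; the function and parameter names are illustrative only, not part of the Botium API:

    def load_for_step(step, convo_count, initial_users, increase_per_step):
        # Convos queued in a given test step (1-based), per the formula above.
        return convo_count * (initial_users + (step - 1) * increase_per_step)

    # Example: a Test Set with 4 convos, starting at 1 user, adding 2 per step.
    for step in (1, 2, 3):
        print(step, load_for_step(step, convo_count=4, initial_users=1, increase_per_step=2))
    # -> 1 4 / 2 12 / 3 20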

Test Step Duration

Botium generates load in iterations called test steps. During each test step, Botium adds a specified load to the processing queue, and the Botium Agents execute these as quickly as your chatbot can handle.

In this context, "load" refers to the conversations (convos) to be performed — these convos are part of the Test Set associated with the Test Project.
Tip: For Load/Stress testing, the steps are executed every 10 seconds. In Advanced mode, you can customize this interval.

Initial Users

During each test step, the content of the Test Set is added to the processing queue. To add it multiple times per test step (essentially repeating the same convos), you can increase the number of parallel users to be simulated. In the context of a stress test, this is the initial number of parallel users that the test should start with.

New Users per Test Step

This is derived from the number of 'Parallel Users' that the test should end with: Botium adds this many new users at each test step so that the final step reaches that target.

Note: In this stress test example, we simulate an increasing number of users in the pattern 1, 3, 5, 7, 9, 11, 13. Starting with 1 initial user, the number increases at each test step for a total duration of one minute, with each step lasting 10 seconds. Since the Test Duration determines when the last test step starts, steps begin at 0, 10, ..., 60 seconds. This results in:
  • 7 test steps
  • 2 new users per step (every 10 seconds)
  • 13 maximum users
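
The same arithmetic can be checked with a short Python sketch (the variable names are illustrative only):

    # 1 minute duration, a step every 10 seconds, 1 initial user, 2 new users per step.
    duration_s, step_interval_s = 60, 10
    initial_users, new_users_per_step = 1, 2

    # Test Duration determines when the LAST step starts, so steps begin
    # at t = 0, 10, ..., 60 seconds.
    steps = duration_s // step_interval_s + 1
    users_per_step = [initial_users + i * new_users_per_step for i in range(steps)]
    print(steps, users_per_step, max(users_per_step))
    # -> 7 [1, 3, 5, 7, 9, 11, 13] 13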



Required Percentage of Successful Users

If the tests break the chatbot, or if the chatbot has configuration errors and does not work at all, there is no point in continuing a long performance test. When the percentage of successful users falls below this threshold, the test stops early. The configured percentage is indicated on the Parameters tab of the test results.

Example Usage:

In a stress test starting with 5 users and adding 100 new users per step, the time to consider a chatbot message failed is set to 2 seconds, and the required success rate is 50%.

  1. Step 1: 5 users
  2. Step 2: 105 users (5 + 100)
  3. Step 3: 205 users (5 + 2 x 100)

By Step 3, some users start failing because the chatbot is unable to respond within 2 seconds. When the user count reaches 305 in Step 4, more than 50% of the users fail, causing the test to stop.
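
As a rough illustration, the stop rule could be sketched as follows; the failure count is invented for the example, and treating every response slower than the timeout as a failed user is an assumption:

    def should_stop(successful_users, total_users, required_success_pct=50):
        # The run stops once the success rate drops below the required percentage.
        return 100 * successful_users / total_users < required_success_pct

    # Step 4 of the example: 305 users; suppose only 140 still get a
    # response within the 2-second timeout (hypothetical number).
    print(should_stop(successful_users=140, total_users=305))  # -> True, the test stops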

Shared Conversation Session

By default, a separate Botium session is started for each convo execution. Depending on the connector technology, this can add extra time for session setup, slowing down the overall test execution (though it is not included in the measured response times). If you're not focused on measuring performance for individual user sessions and the connector technology doesn’t require separate sessions, you can enable this option to speed up the performance testing process.

Use Simple 'hello' Conversation

Load/Stress tests repeat a simple 'hello' conversation while running. When using the Performance Advanced Test, you have the option to change this by selecting one of your own Test Sets. The selected Test Set is indicated on the Parameters tab of the test results.



Request Timeout

The request timeout is set to 2 seconds by default. When using the Performance Advanced Test, you have the option to change this value. The set value is indicated on the Parameters tab of the test results.



Detailed Logs

Detailed logs are disabled by default. When using the Performance Advanced Test, they can be enabled. This provides more detailed logging on the Agents, which can be useful for debugging. This setting is indicated on the Parameters tab of the test results.



Email Notification

If email notifications are configured in the Test Project's notification settings, this is indicated on the Parameters tab of the test results.
