StresStimulus v4.2
User Interface Reference
TABLE OF CONTENTS
1 Workflow Tree and Functional Area
1.1 Record Test Case
1.1.1 Browser Cache
1.2 Build Test Case
1.2.1 Test Case Settings
1.2.2 Authentication
1.2.3 Variables
1.2.4 Parameters
1.2.5 Response Validators
1.2.6 Verify & Auto-config
1.2.7 Multi Test Cases
1.3 Configure Test
1.3.1 Load Pattern
1.3.2 Test Duration
1.3.3 Browser Type
1.3.4 Network Type
1.3.5 Load Agents
1.3.6 Monitoring
1.3.7 Result Storage
1.3.8 Other Options
1.3.9 Script Editor
1.4 Run and Monitor Test
1.4.1 Runtime Dashboard
1.5 Analyze Results
1.5.1 Opening Previous Results
1.5.2 Test Result Tab
1.5.3 Page and Transaction Result Tab
1.5.4 Comparing Tests
1.6 Workflow Tree Toolbar
1.7 Test Wizard
1.7.1 Record Test Case
1.7.2 Configure Test Case
1.7.3 Configure Test
1.7.4 Run Test
1.7.5 Analyze Results
1.7.6 Record Test Case
1.7.7 Configure Test Case
1.7.8 Configure Test
1.7.9 Run Test
1.7.10 Analyze Results
2 Object Area
2.1 Test Case Tree
2.1.1 Upper Toolbar
2.1.2 Lower Toolbar
2.1.3 Session Inspector
2.2 Session Grid
3 Other Elements
3.1 Main Menu
3.1.1 Standalone Version
3.1.2 Add-on (Integrated) Version
The User Interface (UI) Reference explains the options that appear in StresStimulus windows, dialog boxes and other UI elements. Much of this material is available in the UI. The purpose of this document is to combine this information into a single searchable document.
Because all topics are hierarchically organized, configuration settings and functions can be quickly located not just by content, but also by context. For example, search results for the term "think time" point to the page with the following path:
Workflow Tree and Functional Area -> Build Test Case -> Test Case Settings
While the topic Test Case Settings does not explain how to navigate to its functionality, its context suggests that the user has to click the Build Test Case node and then the Test Case Settings node of the Workflow Tree to configure page think time.
This section includes the help content embedded into StresStimulus and easily accessible as contextual help.
StresStimulus help content includes the following information:
• Treeviews, toolbars and menus. Every node of a TreeView, button of a Toolbar or item of a menu is presented by its icon, description and a tooltip, if one exists.
• Help boxes. Virtually every window, dialog box and toolbar has one or several embedded help boxes, each of which pops up on mouse-over or click of the corresponding light-bulb icon.
• Property grids. The names of properties and their descriptions are provided for every object displayed in a property grid.
• Data grids. Column names and their tooltips are provided for every grid.
1 WORKFLOW TREE AND FUNCTIONAL AREA
This chapter describes every node on the Workflow Tree and the corresponding Functional Area.
The Navigation tree topics in the documentation hierarchically correspond to the Workflow Tree in the application, so that all topics found here can easily be located. For example, if you search for "Data Generators" in the documentation, just follow the Navigation tree structure to find the Data Generators section in the application.
The Workflow Tree includes five top-level nodes, each of which matches a related testing step:
Button Name Action
Record Test Case Create a Test Case by navigating your application OR open an existing Test Case
Build Test Case Create and configure the Test Case
Configure Test Configure the Test and load level
Run and Monitor Test Start a load test and monitor performance …
Analyze Results Open test run results and analyze performance metrics
1.1 Record Test Case
Toolbar
Icon Action Description
Status: Recording
Status: Paused Click to Pause recording.
Resume Click to Resume recording.
Stop Close the browser and create the test case in StresStimulus.
Start New Transaction
End Transaction
Cache Clear browser cache and cookies.
Help Boxes
Recorder To record a test case:
1. Go to the tested website to start recording.
2. To skip recording of some pages, click "Pause".
3. Complete navigating web pages.
4. Click "Stop" to close the web browser.
A new test case will be created.
Notes:
To configure clearing browser cache, click "Cache"
(IE only)
See Also:
Recording with Web Browser
Recording from Other Sources
Transactions To create a transaction:
1. Enter its name and Goal (optional).
2. Click "Start new transaction".
3. Complete navigating through transaction steps.
4. Click "End transaction".
See Also:
Creating Transactions
Transactions To create a transaction, enter its name and
complete navigating through transaction steps.
Entering a new transaction name will designate
the beginning of the subsequent transaction.
See Also:
Creating Transactions
1.1.1 Browser Cache
Help Boxes
Clear Browser Cache Options The Recorder automatically clears resources from the targeted domains on the list.
- Click View/Edit to access the Clear Cache Domain List.
- Check the "Clear" box to enable automatic cache clearing before recording.
- Check the "After recording" box to automatically add new targeted domains to the list.
- Select what resources you want to clear.
See Also:
Automatic browser cache clearing
Clear Cache Domain List The Clear Cache Domain List includes domains whose resources will be cleared from the browser cache before recording test cases.
It is recommended to clear the browser cache before recording a test case. The Clear Cache Domain List is used for automatic cache clearing.
You can edit this list.
See Also:
Automatic browser cache clearing
1.2 Build Test Case
The Build Test Case section includes the following nodes (second-level nodes are indented):
Test Case Settings - Configure Test Case, Page and Request Parameters
    Transactions & Loops - Transaction and Loop Settings
Authentication - Set VU credentials
Source Variables - Define Variables that can be used for Parameterization
    Extractors - Define rules of extracting values from responses
    Datasets - Add or Edit Datasets
    Data Generators - Add or Edit Data Generators
    Functions - Add or Edit Function Instances
Parameters - Use Source Variables to parameterize Requests
Response Validators - Add or edit custom rules to validate responses
Verify & Auto-Config - Verify Test Case, find missing parameters and auto-configure them
Multi Test Cases - Configure multiple Test Cases
1.2.1 Test Case Settings
Toolbar
Icon Description
Expand All
Collapse All
Create a Transaction starting from the selected request
Create a Loop starting from the selected request or transaction
Edit the selected Transaction or Loop
Delete the selected Transaction or Loop
Help Boxes
Transactions & Loops A transaction is a group of sequential requests or pages,
representing a complete business transaction,
which is grouped for the purpose of tracking its performance.
A Loop is a group of sequential requests or pages,
requested multiple times within a test case iteration.
Unconditional Loops repeat a specified number of times.
Conditional Loops have a condition checked at the end of the
loop to determine if the loop should continue or exit.
See Also:
Transactions
Conditional Transactions
Loops
Test Case Settings The tree displays 5 types of objects: Test Case,
Loops, Transactions, Pages and Requests.
To change the selected object properties,
use the property grid below.
Toolbar commands:
- Expand / Collapse tree
- Create or Edit a transaction or loop
- Delete selected object
See Also:
Test Case, Page and Request properties
Test Case Settings Context Menu
Icon Description
Insert Transaction
Insert Loop
Edit Transaction
Show on Tree
Show Session(s)
Clone Page
Delete Page
Properties
Test Case Properties
Property Description
Name The Test Case name
Description The Test Case description
URLs The number of requests in the Test Case
Request Size (KB) The size of all requests in the Test Case
Response Size (KB) The size of all responses in the Test Case
Duration (s) The total of the response times of all pages
Default Goal (s) The default goal for all pages in the test case. To remove the default goal, leave the property empty.
Default Timeout (s) The default timeout for all pages in the test case.
Think Times between pages Select "Recorded" to inject the recorded think time after every page. Select "Constant" to use a constant think time. Select "Random" to randomize think time. Tip: for stress tests, select "Zero".
Think Time (s) A constant think time between pages.
Min Think Time (s) Minimum think time
Max Think Time (s) Maximum think time
Delay after the test case Tips: For stress tests, select "Zero". To issue iterations with a certain frequency, select Pacing.
Delay (s) A constant delay added after the test case replay
Iteration Delay (s) A constant delay between iterations
Minimum duration of the test case replay (s) Enter the minimum test case replay duration. If the test case replays faster, the appropriate delay will be added.
Cache Control Select "Enabled" to emulate browser caching and session management. Select "Disabled" to emulate browsers with disabled caching (all requests will be sent) and restarting browsers before starting a new iteration (browser sessions will not persist across test iterations).
New VU % Percentage of New vs. Returning VUs. Note: (a) On the first iteration, new VUs will have an empty cache, just as a first-time user; all requests will be sent. (b) On the first iteration, returning VUs will have a primed cache; caching rules for each request will be determined based on server caching headers. (c) On subsequent iterations, all VUs are treated as returning VUs.
VU restarting browsers % Percentage of VUs restarting browsers before starting a new iteration. For these users, browser sessions will not persist across the test iterations.
Page Properties
Property Description
Page number The page number
Host The host
Path The path
Query The query string
Title The page title
URLs The number of requests in the page
P.I. URLs The number of page performance impacting requests, which exclude AJAX requests presumed to be loaded after the page is displayed
Request Size (KB) The size of the page's primary and all dependent requests
P.I. Request Size (KB) The size of the page's performance impacting requests
Response Size (KB) The size of the page's primary and all dependent responses
P.I. Response Size (KB) The size of the page's performance impacting responses
Duration (s) The page response time: the time required to receive all responses, excluding the ones downloaded after the page is complete
Goal (s) The page's expected maximum time for all its responses to come back. Iterations where the page response times exceeded the goal are marked as "missed goal". To remove the goal, leave it blank.
Think Time (s) The think time is a delay added at the end of the page to simulate the user's wait time before requesting the subsequent page
Timeout (s) The maximum amount of time for receiving any of the page responses. A page timeout error is triggered if any of the page sessions' response times exceeds the Timeout.
When to Request the Page Select "On 1-st iteration" to skip this page (e.g. login) on the subsequent iterations. Select "On last iteration" to request this page (e.g. logout) on the last iteration only, if the test is set to run a specified number of Iterations.
Request Properties
Property Description
Host The host
Path The path
Query The query string
Timeout (s) The maximum amount of time for receiving the request. A timeout error is triggered if the session's response time exceeds the Timeout.
Caching Rules Select "Not Cached" to always request the session disregarding recorded caching headers. Select "Cached" to never request the session for returning VUs with enabled caching. Select "Normal" to use the recorded caching headers.
Transaction Properties
Property Description
Name The Transaction name
Description The Transaction description
Goal (s) The transaction's completion time limit. To remove the goal, leave it blank.
Think Time (s) The think time is a delay added at the end of the transaction to simulate the user's wait time before requesting the subsequent page
Loop Properties
Property Description
Name The Loop name
Description The Loop description
When to run the Loop Select "Always" to always run the loop. Select "Check a condition first" to compare if an extractor matches a string to decide if the loop should run.
An Extractor name Select from the drop-down an Extractor to compare with the string
Loop Type Select Unconditional for a loop repeated a specified number of times. Select Conditional for a loop with an exit condition.
A string to match Specify a string to compare with the Extractor
Run the Loop if Match? Select Yes to run the Loop when the Extractor matches the String, and skip the Loop when the Extractor does not match the String. Select No to skip the Loop when the Extractor matches the String, and run the Loop when the Extractor does not match the String.
Number of Repeats (Max.) For an Unconditional Loop, set a number of repeats. For a Conditional Loop, set a maximum number of repeats, which cannot be exceeded. To remove the cap, set "-1".
Delay before next Loop (s) The delay before starting the next loop cycle (s)
Condition Type A type of the condition that is checked at the end of the conditional loop to determine whether the loop should continue or exit. If the condition is based on finding a specified text or regular expression in an HTTP response, select "Text Based". If the condition is based on evaluating an extractor, select "Extractor Based".
Response to Search Select a response number from the drop-down, where the specified text or regular expression will be searched.
Search String Specify a character string that will be searched in the HTTP response.
Search String Type If the search string is a text, select "Text". If the search string is a regular expression, select "Regular Expression".
Exit Loop if Found? Select Yes to repeat the loop when "Search String" is not found, and exit the loop when "Search String" is found. Select No to repeat the loop when "Search String" is found, and exit the loop when "Search String" is not found.
Extractor Name Select from the drop-down an Extractor that will be used in the "Extractor Based" condition.
Text to compare Specify a text to compare with the Extractor value in the "Extractor Based" condition.
Exit Loop if Match? Select Yes to repeat the loop when "Text to Compare" does not match the Extractor, and exit the loop when "Text to Compare" matches the Extractor. Select No to repeat the loop when "Text to Compare" matches the Extractor, and exit the loop when "Text to Compare" does not match the Extractor.
Run Transaction if Match? Select Yes to run the Transaction when "Text to Compare" matches the Extractor, and skip the Transaction when it does not match the Extractor. Select No to skip the Transaction when "Text to Compare" matches the Extractor, and run the Transaction when it does not match the Extractor.
1.2.2 Authentication
Toolbar
Button Action
Import a .csv file with user credentials
Help Boxes
Server Authentication Use this grid when the tested website uses Basic, Windows Integrated (e.g. NTLM) or Kerberos authentication.
Enter users' credentials in the grid, or click Import on the toolbar and select a .csv file with the users' credentials. The .csv file must have 3 columns and no header (a sample is shown after the column list below).
Multiple user credential rows are consumed by the VUs in round-robin order.
Note: For Form authentication, in the Datasets section, create an Authentication dataset and use it in the Parameterization section.
See Also:
Authentication
Authentication Columns
Column Description
Domain Domain to authenticate a VU
Username Username to authenticate a VU
Password Password to authenticate a VU
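For example, a three-user credentials file might look like this (the domain and account values are hypothetical); each row holds Domain, Username and Password, with no header row:

    CORP,jsmith,P@ssw0rd1
    CORP,mjones,P@ssw0rd2
    CORP,akumar,P@ssw0rd3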
1.2.3 Variables
Help Boxes
Variables Variables are evaluated and used during the replay:
1. In Parameters, to replace request recorded values.
2. In Actions, for custom processing (coming soon).
Supported Variable types: Datasets, Extractors,
Data Generators, and Functions
See Also:
Parameterizing dynamic tests
1.2.3.1 Extractors
Toolbar
Button Action
Expand All
Collapse All
Create an Extractor for a response selected in the Test Case Tree or the session grid
Edit the selected Extractor
Show the selected Extractor in the Test Case Tree
Show the response associated with the Extractor in the session grid
Move the selected Extractor to the selected response
Clone the selected Extractor to the selected response(s)
Hide autocorrelation parameter details
Show autocorrelation parameter details
Delete the selected Extractor(s)
Help Boxes
Extractors An Extractor defines a rule of retrieving a value from
a response. The value is assigned to a variable with
the same name that can be used to parameterize
subsequent requests.
See Also:
Extractors
Extractor Properties
Property Description
URL The URL of the response that will be parsed to extract a value
Name The Extractor and its variable name. Tip: to use the name from the response viewer below, click "Set the selected text as Extractor Name".
Text Before A text delimiter that occurs immediately before the value to be extracted. Tip: In the response viewer below, select a string before the Extractor value and click "Set the selected text as Text Before". If entering manually: for "new line", use \n; for "Tab", use \t
Text After A text delimiter that occurs immediately after the value to be extracted. Tip: In the response viewer below, select a string after the Extractor value and click "Set the selected text as Text After". If entering manually: for "new line", use \n; for "Tab", use \t
Occurrence The occurrence of the matching text to be extracted. The default is 1.
Use HTML Decoding? To apply HTML-decoding to the Extractor value, select Yes. Example of HTML-decoding: converting "&gt;" to ">". The default is No.
Use URL Decoding? To apply URL-decoding to the Extractor value, select Yes. Example of URL-decoding: converting "%3F" to "?". The default is No.
Enforce URL encoding? The default is No. When using an extractor in a parameter, StresStimulus automatically URL-encodes its value when necessary. Select Yes to enforce the URL encoding of the extractor's value. Use this option with caution as it can result in double-encoding in a parameter.
Returned recorded value The value returned by the Extractor from the recorded response.
Description The Extractor description.
Regular Expression The regular expression with a single capture group <val> whose value will be extracted. Example: \w+\d="(?<val>.+)" returns the value of a name/value pair where the name ends with a digit.
Header The response header name, selected from the drop-down.
Form Field The form field name, selected from the drop-down.
XPath query Enter an XPath query to extract a value from a web service XML response.
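As an illustration of the two most common extractor types, here is a minimal Python sketch of the matching rules (StresStimulus is a .NET application; the .NET named group (?<val>...) corresponds to Python's (?P<val>...)). The function names and the sample body are hypothetical:

    import re

    def text_delimited(body, text_before, text_after, occurrence=1):
        # Value between the Nth occurrence of text_before and the next text_after.
        start = -1
        for _ in range(occurrence):
            start = body.index(text_before, start + 1)
        start += len(text_before)
        return body[start:body.index(text_after, start)]

    def regex_extract(body, pattern, occurrence=1):
        # Value of the <val> group of the Nth regex match.
        for i, m in enumerate(re.finditer(pattern, body), start=1):
            if i == occurrence:
                return m.group("val")

    body = 'field1="abc123" field2="xyz789"'
    print(text_delimited(body, 'field2="', '"'))               # xyz789
    print(regex_extract(body, r'\w+\d="(?P<val>[^"]+)"', 2))   # xyz789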
Create Extractor
Toolbar
Button Action
Set the selected text as "Text Before"
Set the selected text as "Text After"
Set the selected text as an Extractor Name
Find Previous (Shift+F3)
Find Next (F3)
Verify the Extractor
Save the Extractor and close this window
Save the Extractor and start a New one
Help Boxes
Create Extractor To Create an Extractor:
1. Select a session in the Test Case Tree or session grid.
2. Select an Extractor Type from the list above.
3. Configure Extractor's properties.
4. Verify the Extractor.
5. Click Save.
For help with specific extractor types, check the light bulb
on the right.
See Also:
How to create an Extractor
Edit Extractor To Edit the Extractor:
1. Change Extractor's properties.
2. Verify the Extractor.
3. Click Save.
For help with specific extractor types,
check the light bulb on the right.
See Also:
How to create an Extractor
Extractor Type Extractor Type defines a text search rule.
1. Text Delimited type extracts a value that occurs in the
response between two specified text strings.
2. Regular Expression type extracts a value that is found
in the response using regular expression search.
3. Header type extracts a value of a specific response
header selected in the Header drop-down.
4. Form Field type extracts a value of a specific web form
field selected in the Form Field drop-down.
5. XPath query type extracts a value that is found in
an XML response body.
See Also:
Text Delimited Extractor
Regular Expression Extractor
Header, Form Field and XPath Extractor
Next / Previous Extractor If more than one extractor value can be found in the response, continue clicking Next/Previous Occurrence until you find the correct value. Then click Set the Occurrence to properly adjust this property.
1.2.3.2 Datasets
Toolbar
Button Action
New - Create a new Dataset
Create Authentication Dataset
Import a CSV file as a Dataset
Edit the selected Dataset
Export the Dataset to a .csv file
Delete the selected Dataset
Help Boxes
Datasets A Dataset is a predefined set of records.
Datasets are used in request parameters to:
- retrieve a value from the Dataset,
- assign the value to a variable, and
- replace a recorded value in the request with the variable.
The variable name is <dataset name>.<column name>.
See Also:
Datasets
Dataset Commands 1. To add a Dataset, click New to create an empty Dataset that can be populated manually or by pasting values. Or click "Import a CSV file as a Dataset".
2. To create a dataset for Forms Authentication, click Authentication Dataset.
3. To edit data, select a Dataset from the drop-down.
4. To edit the Dataset structure, click Edit.
5. Other available operations are Export Data and Delete.
Note:
- URL encoding in .csv files is supported.
- For Basic, Windows Integrated (e.g. NTLM) or Kerberos authentication, use the Authentication section.
See Also:
Datasets
Add/Edit Dataset Structure To add a field, enter the field name and click "Add Field". Double-click the field to rename it.
To reposition, rename or delete a selected field, use the Up, Down, Rename or Remove buttons.
To create or rename a Dataset, enter a
one-word Dataset name.
See Also:
Datasets
1.2.3.3 Data Generators
Toolbar
Button Action
New - Create a new Data Generator
Verify the value returned by the selected Data Generator
Delete the selected Data Generator
Help Boxes
Data Generators A Data Generator returns a random value
assigned to a variable with the same name,
which can be used to parameterize requests.
See Also:
Data Generators
Data Generator Properties
Property Description
Name The Data Generator name
Min Value The minimum generated value
Max Value The maximum generated value
Format String A string specifying a format of presenting a value. For details about different data type formats, check "Formatting Types" in .NET. For example, the .NET numeric format string "D4" presents 42 as "0042".
When To Evaluate Select On Iteration to generate a new value on every iteration (default). Select On Request to generate a new value on every request.
Description The Data Generator description
Type Select Random to generate random integers between the Min Value and Max Value. Select AutoIncrement to generate sequential integers starting from the Min Value; after the Max Value is reached, the next integer is the Min Value. This is only used for Integer Data Generators.
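The AutoIncrement wrap-around behavior can be pictured with a small Python sketch (illustrative semantics only; the class and method names are hypothetical):

    import random

    class IntegerGenerator:
        def __init__(self, min_value, max_value, auto_increment=False):
            self.min_value, self.max_value = min_value, max_value
            self.auto_increment = auto_increment
            self.current = min_value - 1  # nothing generated yet

        def next(self):
            if self.auto_increment:
                # Sequential values; after Max Value, wrap back to Min Value.
                self.current = (self.min_value if self.current >= self.max_value
                                else self.current + 1)
                return self.current
            return random.randint(self.min_value, self.max_value)

    gen = IntegerGenerator(1, 3, auto_increment=True)
    print([gen.next() for _ in range(5)])  # [1, 2, 3, 1, 2]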
Create Data Generator
Toolbar
Button Action
Save the Data Generator and close this window
Save the Data Generator and create a New one
Verify the value returned by the Data Generator
Help Boxes
Data Generator Types Data Generator Type determines
what random value will be returned:
Integer, Double, Date/Time, GUID
or Text.
See Also:
Data Generators
Create Data Generator To Create a Data Generator:
1. Select its Type from the list on the left.
2. Configure its properties.
3. Click Verify to test the returned value.
4. Click Save.
See Also:
Data Generators
Data Generator Properties
Property Description
Name The Data Generator name
MinValue The minimum generated value
MaxValue The maximum generated value
FormatString A string specifying a format of presenting a value. For details about different
data type Formats, check "Formatting Types" in .NET
When To Evaluate Set to On Iteration to generate a new value on every iteration (default). Set to On Request to generate a new value on every request
Description The Data Generator description
1.2.3.4 Functions
Toolbar
Button Action
New - Create a new Function Instance
Delete the selected Function Instance
Help Boxes
Function Instances A Function Instance returns a dynamic value
that depends on the function type and the
context where the function is called from.
The value is assigned to a variable with the
same name, which can be used to parameterize
a request.
The Function Instance is called before issuing the request.
See Also:
Functions
Properties
Property Description
Name The Function Instance name
Function Type Determines what internal variable or constant will be returned by a Function
Instance. The options are:
• Agent Name
• Agent VU Number
• Test Case Name
• Agent Iteration Number
• URL Number
• Agent Request Number
• Current DateTime
• Current UTC DateTime
Use Unix Time format? This property is available in the Current DateTime and Current UTC DateTime functions. Select Yes to return the number of milliseconds that have elapsed since Jan 1, 1970. Select No (default) for all other formatting options.
Format String A string specifying a format of presenting a value. For details about different
data type Formats, check "Formatting Types" in .NET
Description The Function Instance description
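For the date/time functions, the two output modes can be illustrated in Python (the .NET format string shown is only an example; Python's strftime codes differ from .NET's):

    from datetime import datetime, timezone

    now = datetime.now(timezone.utc)

    # "Use Unix Time format? = Yes": milliseconds elapsed since Jan 1, 1970 (UTC).
    unix_ms = int(now.timestamp() * 1000)

    # "Use Unix Time format? = No": a format string is applied instead;
    # .NET "yyyy-MM-dd HH:mm:ss" roughly corresponds to this strftime pattern.
    formatted = now.strftime("%Y-%m-%d %H:%M:%S")

    print(unix_ms, formatted)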
Create Function Instance
Toolbar
Button Action
Save the Function Instance and close this window
Save the Function Instance and create a New one
Help Boxes
Function Types Supported Function Types:
- AgentName - name of the current Agent
- AgentVUNumber - the current VU number within an Agent’s pool of
VUs
- TestCaseName - the name of the current Test Case
- AgentIterationNumber - the current iteration number executed by an
Agent
- URLNumber - the current URL number within a test case
- AgentRequestNumber - the current request number issued by an Agent from the beginning of the test
- Current DateTime - the current date/time
- Current UTC DateTime - the current UTC date/time
See Also:
Functions
Create Function
Instance
To Create a Function Instance:
1. Select a Function Type (described in the light bulb on the left).
2. Configure the Function Instance properties.
3. Click Save.
See Also:
Functions
Function Properties
Property Description
Name The Function Instance name
Format String A string specifying a format of presenting a value. For details about different data type Formats, check "Formatting Types" in .NET
Description The Function Instance description
1.2.4 Parameters
Toolbar (All parameterization controls)
Button Action
Show the selected Parameter in the Test Case Tree
Save
Undo to Last Saved (Ctrl+Z)
Restore to recorded
Switch to Parameterization Editor
Switch to Parameterization Grid
Switch to Free Format Request Editor
Hide autocorrelation parameter details
Show autocorrelation parameter details
Find and Replace parameters' values (Ctrl+F)
Text box Find Value
Find Next (F3)
Find Previous (Shift+F3)
Label Session Number
Header Parameterization Grid
Column Description
Header Request header name
Recorded Value Request header recorded value
Replace with Parameterization expression replacing the recorded value
Query String Parameterization Grid
Column Description
Query Parameter Query string parameter name
Recorded Value Query string parameter value
Replace with Parameterization expression replacing the recorded value
Web Form Parameterization Grid
Column Description
Form Field Form Field name
Recorded Value Form field recorded value
Replace with Parameterization expression replacing the recorded value
Global Find and Replace ... (Ctrl+F)
Help Boxes
Parameterization Controls Which Parameterization Control to use:
- Parameterization Grid: for configuring name/value
pair parameters.
- Parameterization Editor: for configuring name/value
pair parameters with long values or when
Find and Replace is needed.
- Free Format Request Editor: for configuring free format
requests.
See Also:
Parameterization Controls
Parameterization Grid Parameters change recorded requests during the replay. The new values are derived from Variables.
To create a Parameter:
1. Select a request in the Test Case Tree.
2. Select the tab above for the necessary request part: Header, URL and Query, or Body.
3. Click the "Replace with" column of the Parameter.
4. In the Variable Picker that appears, select the Variable.
See Also:
Parameterization Grid
Parameterization Editor The Parameterization Editor displays name/value pairs as:
- a read-only blue "name line" and
- an editable "value line".
To edit data:
1. Optionally select text in a "value line" that should be replaced.
2. Right-click in the "value line" and select a Source Variable in the Variable Picker that appears.
- To find a text, enter it into the "...Find" box and click Find Next. For advanced search, click Global Find and Replace.
- Click Save to save the changes.
- Click Undo to the Last Saved to discard the changes.
See Also:
Parameterization Editor
Free Format Request Editor Select or search the text to be parameterized. Right-click and, in the Variable Picker that appears, select a Variable.
- To find text, enter it into the "...Find" box and
click Find Next. For advanced search click
Global Find and Replace.
- Click Save to save the changes.
- Click Undo to discard the changes.
- Click Restore to restore the recorded request.
See Also:
Free Format Request Editor
1.2.4.1 Variable Picker
Help Boxes
Variable Picker - Select a Variable in the Extractor,
Data Generator or Function category.
- Or select a Variable as a Dataset,
field and databinding method.
The Variable will be injected into the
request and will replace any selected text.
DataBinding Methods The databinding method determines the order of assigning dataset rows to request parameters.
- Request-Bound databinding method: Every parameter requested by any VU in any iteration gets a subsequent dataset row.
- VU-Bound databinding method: Every VU gets a subsequent dataset row used for all its parameters requested in all iterations.
- Iteration-Bound databinding method: Every iteration gets a subsequent dataset row used by all VUs in all requested parameters.
- Iteration-Request-Bound databinding method: Every subsequently requested parameter in every iteration gets the subsequent dataset row shared by all VUs.
- VU-Iteration-Bound databinding method: Every VU in every iteration gets a subsequent dataset row used in all its requested parameters.
- Parameter-Bound databinding method: Every requested parameter gets a subsequent dataset row shared by all VUs in all iterations.
- Random databinding method: Every request parameter gets a random dataset row.
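The methods differ only in which counter advances the dataset row. A condensed Python sketch of four of the seven methods (zero-based counters, wrapping around the dataset; the names are illustrative, not product code):

    import random

    def dataset_row(method, vu, iteration, request_seq, row_count):
        # Which counter picks the dataset row for one parameter lookup.
        if method == "request-bound":    # every parameter request advances the row
            return request_seq % row_count
        if method == "vu-bound":         # one row per VU, for all its iterations
            return vu % row_count
        if method == "iteration-bound":  # one row per iteration, shared by all VUs
            return iteration % row_count
        if method == "random":           # an independent random row every time
            return random.randrange(row_count)
        raise ValueError("method not covered by this sketch: " + method)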
1.2.5 Response Validators
Toolbar
Button Action
Create a Validator for a response selected in the Test Case Tree or the session grid
Show the selected Validator in the Test Case Tree
Show the response associated with the Validator in the session grid
Move the selected Validator to the selected response
Clone the selected Validator to the selected response(s)
Delete the selected Validator(s)
Help Boxes
Validators A Validator is a rule of comparing a response
with a text pattern. In case of mismatch,
a custom error is raised.
See Also:
Validators
Validator Properties
Property Description
URL The URL of the response that will be validated
Text to search Text/HTML or regular expression to search for in the response
Is text a regular expression? Select Yes if the validation string is a regular expression.
Fail If Choose whether to raise the error when the string is Found or Not Found in
the response.
Scope Set Scope to "Selected Response" to create a Validator for a request
selected in the Test Case Tree or in the session grid. Set Scope to "All
Responses" to create a global Validator.
Description Validator Description
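The "Fail If" logic can be summarized in a short Python sketch (illustrative only):

    import re

    def validate(response_body, text_to_search, is_regex, fail_if_found):
        # Returns an error message when the Validator rule fails, else None.
        found = (re.search(text_to_search, response_body) is not None
                 if is_regex else text_to_search in response_body)
        # fail_if_found=True: raise the error when the pattern is Found;
        # fail_if_found=False: raise the error when it is Not Found.
        if found == fail_if_found:
            return 'Validation failed: "%s" %s in response' % (
                text_to_search, "found" if found else "not found")
        return None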
1.2.6 Verify & Auto-config
Verify & Auto-config Toolbar
Button Action
Click to auto-verify the Test Case
Drop-down Click drop-down to select Full or Quick verify method.
Full Verify with preview of web pages
Quick Verify without preview of web pages
Enter a session #. Verify will stop after this session.
Run Parameter Finder.
Verify & Auto-config Help Boxes
Verify Test Case When verifying the Test Case:
1. The test runs one time in debug mode. Test Configuration settings do not affect this run.
2. Replayed sessions are automatically compared with the corresponding recorded sessions, and deeply analyzed.
3. Errors, warnings, configuration recommendations and other diagnostics are displayed in the Session Verification Tree and Extractor Verification Tree.
- To change the number of VUs, enter a different VU number.
- To select the Full or Quick Verify method, click the drop-down.
- To stop Verify earlier, specify a session number, after which to stop.
See Also:
Verifying Test Case
Parameter Finder Parameter Finder finds possible missing extractors and parameters. Creating them can fix configuration errors and make the test more realistic.
Run Parameter Finder after running "Verify".
See Also:
Parameter Finder
1.2.6.1 Session Verification
Session Verification Toolbar
Button Action
Expand All
Collapse All
Compare the selected recorded and replayed sessions
Click to show all verified sessions
Click to show sessions with errors
Click to show sessions with warnings
Click to show sessions with Notifications
Session Verification Help Boxes
Session Verification Tree The Session Verification Tree matches the recorded and replayed requests in the Test Case.
- To compare recorded and replayed sessions selected on the tree, click the Compare button.
- To view session content, double-click the Recorded or Replayed node.
See Also:
Comparing Sessions
Session Filtering To display a subset of sessions, click one of the
filtering buttons on the toolbar:
- URLs: all sessions;
- Errors: sessions with errors related
to the test configuration;
- Warnings: sessions with issues that may be related
to the test configuration;
- Notifications: sessions with issues unrelated
to the test configuration.
1.2.6.2 Parameter Finder
Parameter Finder Toolbar
Button Action
Click to Group by Extractors
Click to Group by Requests
Expand All
Collapse All
Parameter Creator: Auto-Configure the Extractor and all Parameters in the selected
node
Parameter Creator: Auto-Configure all Parameters in the selected node
Auto-Configure all Extractors and Parameters
Delete the selected Parameter Recommendation
Parameter Finder Help Boxes
Parameter Finder Tree Parameter Finder Tree displays possible missing
extractors and parameters. It has two views:
1. Group by Request view displays:
a) On each parent node, a request requiring one
or several parameters;
b) On each child node, a parameter with matching
extractor.
2. Group by Extractor view displays:
a) On each parent node, an extractor that can be
used in one or several parameters.
b) On each child node, a parameter using the extractor.
Note: To copy selected object content, hit (Ctrl+C).
See Also:
Parameter Finder
Parameter Finder Tab To create Extractors and Parameters discovered by the Parameter Finder, use the Parameter Creator, which creates these objects one-by-one.
The Auto-Configurator creates all Extractors and Parameters at once.
See Also:
Parameter Creator
Auto-Configurator
1.2.6.3 Extractor Verification
Extractor Verification Toolbar
Button Action
Show the selected Extractor in the Test Case Tree
Show the response associated with the Extractor in the session grid
Delete the selected extractors and associated parameters
Extractor Statuses
Status Description
The extractor is OK.
The extractor is not found in the response.
The extractor's value is not used in the Test Case.
The recorded and replayed extractors are the same.
Extractor Verification Help Boxes
Extractor Verification Tree The Extractor Verification Tree is generated when running "Verify". It displays extractor values and the following exceptions:
- The extractor is not found in the response.
- The extractor's value is not used in the Test Case.
- The recorded and replayed values are the same.
The extractors with exceptions are automatically
checked for easy removal, as some of them may be
unnecessary.
See Also:
Verifying Test Case
Extractor Verification Properties
Property Description
Extractor name The name of the extractor
Value The value returned by the extractor during the verify
1.2.7 Multi Test Cases
Toolbar
Button Action
Open a session file as a Test Case
Import Test Cases from another Test
Click to view the selected Test Case and unlock it for changes
Clone the selected Test Case
Delete the selected Test Case
Export Test Case as an HTTP archive (.har)
Help Boxes
Multi-Test Cases Multi Test Cases are used to emulate different categories of users.
- Test Cases are executed concurrently.
- VUs are distributed between the Test Cases in proportion to their Mix Weight properties.
- To start configuring or reviewing the selected Test Case, double-click it or click the "Click to view …" button on the toolbar. After that, the entire Build Test Case section on the Workflow Tree will be associated with this Test Case.
Note: Selecting the Test Case as Current does not impact concurrent execution of multi-test cases.
See Also:
Multiple Test Cases
Editing and Deleting Test Case
Multi-Test Case commands - To create a new Test Case, click "Record" or
"Open session file" and select an .saz or .har file.
- To import Test Cases from another Test, click Import.
- To clone the selected Test Case, click "Clone Test Case".
- To delete the selected Test Case, click Delete.
- To change the properties of the selected Test Case,
modify them in the property grid below.
See Also:
Editing and Deleting Test Case
Test Case Properties
Property Description
Name The Test Case name.
Description The Test Case description.
Mix Weight The relative frequency (in units) of the Test Case replays in the mix. Every VU is assigned to a specific Test Case selected in a round-robin order, while skipping some of them to achieve the VU distribution corresponding to the mix weights (a sketch of this distribution follows the table).
URLs The number of requests in the Test Case.
Think Times between pages Select "Recorded" to inject the recorded think time after every page. Select "Constant" to use a constant think time. Select "Random" to randomize think time. Tip: for stress tests, select "Zero".
Delay after the test case Tips: For stress tests, select "Zero". To issue iterations with a certain frequency, select Pacing.
Cache Control Select "Enabled" to emulate browser caching and session management. Select
"Disabled" to emulate browsers with disabled caching (all requests will be sent)
and restarting browsers before starting a new iteration (browser sessions will
not persist across test iterations).
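As a worked example of Mix Weight: with two Test Cases weighted 2 and 1, VUs are assigned in a 2:1 ratio. One way to picture such a weighted round-robin in Python (illustrative only; the product's exact skipping algorithm is not documented here):

    def assign_test_cases(vu_count, weights):
        # Distribute VUs across test cases in proportion to their mix weights.
        schedule = []
        for case_index, weight in enumerate(weights):
            schedule += [case_index] * weight   # e.g. [0, 0, 1] for weights (2, 1)
        return [schedule[vu % len(schedule)] for vu in range(vu_count)]

    print(assign_test_cases(6, [2, 1]))  # [0, 0, 1, 0, 0, 1]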
1.2.7.1 Test Case Groups
Toolbar
Button Action
Create a Test Case Group
Edit the selected Test Case Group
Delete the Test Case(s) from the Test Case Group
Help Box
Test Case Group When at least one Test Case Group is created,
Sequential-Concurrent TC mixing model is used:
- Test cases in a TC Group are executed sequentially.
- Multiple TC Groups are executed concurrently.
- VUs are distributed between the TC Groups
proportionate to their Mix Weight properties.
- Only test cases included in TC Group(s) are executed.
To go back to the Concurrent TC mixing model,
delete all TC Groups.
See Also:
Sequential Test Case Groups
Test Case Groups Dialog Commands
Button Action
Add Test Case(s) to the Test Case Group
Delete Test Case(s) from the Test Case Group
Move Test Case Up for earlier execution
Move Test Case Down for later execution
Test Case Group Properties
Property Description
Name The Test Case Group name
Description The Test Case Group description
Mix Weight The relative frequency (in units) of the Test Case Group replays in the mix. Every VU is assigned to a specific Test Case Group selected in a round-robin order, while skipping some of them to achieve the VU distribution corresponding to the mix weights.
Cache Control Select "Enabled" to emulate browser caching and session management. Select "Disabled" to emulate browsers with disabled caching (all requests will be sent) and restarting browsers before starting a new iteration (browser sessions will not persist across test iterations).
New VU % Percentage of New vs. Returning VUs. Note: (a) On the first iteration, new VUs will have an empty cache, just as a first-time user; all requests will be sent. (b) On the first iteration, returning VUs will have a primed cache; caching rules for each request will be determined based on server caching headers. (c) On subsequent iterations, all VUs are treated as returning VUs.
VU restarting browsers % Percentage of VUs restarting browsers before starting a new iteration. For these users, browser sessions will not persist across the test iterations.
1.3 Configure Test
The Configure Test section includes the following nodes:
Load Pattern Configure load
Test Duration Configure Test completion criteria
Browser Type Configure Browser Mix
Network Type Configure Network Mix
Load Agents Configure Load Agents
Server Monitoring Configure servers' performance monitoring
Result Storage Configure Result Storage settings and the amount of saved data
Other Options Advanced Options
Script Editor Modify Test Script
Help Boxes
Configure Test Navigate through every "Configure Test"
item and select desired test parameters.
Property Grid
Property Description
Test Run Name A one-word Test Run Name used as a suffix following a time-stamp of the next test run displayed in the "Analyze Results" section. If the Test Run Name is empty, the test file name is used as a suffix.
Test Description Test Description included into the test result Summary view
1.3.1 Load Pattern
Help Boxes
Load Pattern The load pattern defines the dynamics of virtual users (VUs) throughout the test.
See Also:
Load Pattern
Load Pattern Properties
Property Description
Load Pattern Select "Steady Load" to keep a steady number of VUs. Select "Step Load" to
ramp-up the number of VUs on every step.
Number of VU The constant number of VUs emulated throughout the test.
Start VU The initial number of VUs in the beginning of the test.
Step VU Increase The number of VUs added on every step.
Step Duration (s) The time interval between increasing VU count.
Max VU Maximum VU count. Note: the test can complete before reaching "Max VU" if the Test Duration is not long enough.
Over (s) The amount of time taken in the beginning of each step to gradually add VUs.
For instant increase, use zero.
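A worked example of a Step Load: starting at 10 VUs and adding 5 VUs every 60 seconds reaches a 30-VU cap after four minutes. A minimal Python sketch, assuming instant step increases (Over = 0):

    def vu_count(t, start_vu, step_increase, step_duration, max_vu):
        # VUs active at time t (seconds) under a Step Load pattern.
        steps_completed = int(t // step_duration)
        return min(start_vu + steps_completed * step_increase, max_vu)

    print([vu_count(t, 10, 5, 60, 30) for t in (0, 59, 60, 120, 300)])
    # [10, 10, 15, 20, 30]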
1.3.2 Test Duration
Help Boxes
Test Duration Set the test completion criteria. After reaching
this condition, the test will stop.
See Also:
Test Duration
Test Duration Properties
Properties Description
Test Completion Condition Select the test completion condition from the drop-down. The options are: Number of Iterations; Run Duration; Reaching max VUs.
How to count iterations Select "Per VU" to set the iteration limit for each VU. Select "Total" to set the overall iteration limit.
Max. iterations The maximum number of iterations, after which the test completion condition is reached.
Load generation time (hh:mm:ss) Enter the duration of load generation, after which no request will be issued.
After the Completion Condition is reached After the specified test completion condition is reached, depending on this selection, the test will stop, will wait until all pending responses are received, or will wait for iterations to complete. The options are: Wait for responses; Stop the test; Wait for iterations to complete.
Warm-up time (s) The warm-up period at the beginning of the test is necessary to prepare the server for normal working conditions. During the warm-up period the number of VUs is gradually ramped up, the server cache is populated, and necessary modules are loaded into memory. Performance metrics are not collected during the warm-up period.
1.3.3 Browser Type
Toolbar
Button Action
Add - Click to add a browser to the mix - Then configure its settings
Delete - Click to delete the selected browser from the mix
Help Boxes
Browser Type Configure web browser settings. If necessary,
add more web browsers to the mix.
Tip: How StresStimulus emulates web browsers:
(a) It maintains the configured connection limits.
(b) It injects the appropriate user-agent string
into the requests.
(c) It maintains the browser mix distribution,
if more than one browser is selected.
See Also:
Browser Settings
Browser Properties
Property Description
Browser Type Select a web browser or "Custom" from the drop-down. Supported browser types:
• IE11, IE10, IE9, IE8, IE7, IE6
• Firefox
• Chrome
• Opera
• Safari
• Non-browser application
• Custom
Mix Weight Relative frequency (in units or percent) of using this browser by VUs.
Replace User-Agent string Select "True" to use a User-Agent string of the selected browser type instead of the recorded string. Select "False" to keep the recorded string.
User Agent If replacing the recorded User-Agent string, enter a custom string.
Connection limit per host To set custom browser performance, enter the maximum number of TCP connections per host.
Connection limit per proxy To set custom browser performance, enter the maximum number of TCP connections across all hosts.
1.3.4 Network Type
Toolbar
Button Action
Add - Click to add a network to the mix - Then configure its settings
Delete - Click to delete the selected network from the mix
Help Boxes
Network Type Configure network settings. If necessary, add more networks to the mix.
Tip: A network type other than LAN is emulated by injecting a certain wait time into every request and response, weighted by its size and the network type's bandwidth.
See Also:
Network Settings
Network Properties
Property Description
Network Type Select a network type or "Custom" from the drop-down.
Mix Weight Relative frequency of using this network by VUs
Upload Bandwidth (kbps) Enter the upload bandwidth (kbps)
Download Bandwidth (kbps) Enter the download bandwidth (kbps)
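StresStimulus's exact shaping formula is not documented here, but a first-order approximation of the injected wait time is the transfer size divided by the bandwidth:

    def injected_delay_seconds(size_kb, bandwidth_kbps):
        # Time to push size_kb through a bandwidth_kbps link (first-order estimate).
        bits = size_kb * 1024 * 8              # kilobytes -> bits
        return bits / (bandwidth_kbps * 1000)  # kbps -> bits per second

    # A 200 KB response over a 1544 kbps (T1-class) downlink adds roughly 1.06 s.
    print(round(injected_delay_seconds(200, 1544), 2))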
1.3.5 Load Agents
Toolbar
Button Action
Add a Load Agent connection
Edit the selected Load Agent connection
Test connections to the Load Agents with non-zero VU weights
Reset the Selected Agent
Delete the selected Load Agent connection
Help Boxes
Load Agents Load Agents are computers emulating virtual users
in the distributed test, orchestrated by this controller.
To create a Load Agent:
1. On a remote computer, install StresStimulus, and in the
StresStimulus menu -> Agent Options, enable Agent Mode.
2. On this computer, add a connection to the Load Agent.
See Also:
Attaching Agents to Controller
Load Agents Load Agents are computers emulating virtual users
in the distributed test, orchestrated by this controller.
To create a Load Agent:
1. On a remote computer, install StresStimulus, and in the
StresStimulus menu -> Agent Options, enable Agent Mode.
2. On this computer, add a connection to the Load Agent.
To set the portion of the total number of VUs on this
Load Agent, change its mix weight property.
To set the total number of VUs in the test, navigate
to the Load Pattern section.
See Also:
User-Agent
Attaching Agents to Controller
Load Agent Properties
Property Description
Agent Name The Load Agent Name
StresStimulus
Version
The version of StresStimulus installed on the agent.
Host / IP
Address
The Load Agent Host. Enter a network computer name or IP address without
"//". Example: AGENT1 or 10.2.2.169
Mix Weight Relative number of VUs (in units or percent) emulated on the Load Agent. To disable a Load Agent, set its mix weight to zero.
Starting Thread Count The number of threads created automatically when the test is launched. If more threads are needed, the load engine will gradually create more while checking available system resources. Increasing the starting thread count can improve load engine performance, but it can also overload systems with limited resources.
VUs Constant number of VUs. (Read only)
Start VUs Starting number of VUs. (Read only)
Step VU Increase VU step increase if the Step Load Pattern is used. (Read only)
Max VUs Maximum VUs if the Step Load Pattern is used. (Read only)
Username Username to access Remote Agent.
Password Password to access Remote Agent.
1.3.6 Monitoring
1.3.6.1 Windows Servers and Agents
Toolbar
Button Action
Add A Machine To Monitor
Edit The Performance Counters
Delete Selected Objects
Find Previous
Find Next
Help Boxes
Server or Agent Monitoring During the test run you can monitor multiple performance
counters on the web, application and database servers,
as well as agents. Real-time graphs and performance values
will be displayed on the real-time dashboards and in the
performance reports.
- Click "Add" to add a Server with default set of counters.
- Click "Edit" to add or delete performance counters to the
selected server.
- Click "Delete" to delete the selected object(s).
See Also:
Windows Servers Monitoring
Linux/UNIX Servers Monitoring
Threshold Rules
Add a Machine for Monitoring Enter a machine IP address or computer name
without "//". Example: 10.2.2.169 or WEB_SRV5
Property Grid
Name Description
Machine A network computer IP address or computer name without "//"
Domain Network Domain Name
UserName User Name on the network computer
Password Password
Category Performance Counter Category
Counter Performance Counter Name
Instance Performance Counter Instance
Enable Threshold? Select Yes to enable the threshold. Default is No.
Warning Threshold Enter the Warning Threshold value
Critical Threshold Enter the Critical Threshold value
Alert if Over? Select Yes to indicate that exceeding a threshold is a problem. Select No to
indicate that falling below a threshold is a problem.
Add Performance Counters
Help Boxes
Add Windows Server Performance
Counters
1. Select a performance object, counter, and
instance.
2. Click "Add" to add it to the New Counter List.
Note: The performance counters are the same as in the
Windows Perfmon application.
See Also:
Windows Server Monitoring
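For reference (outside StresStimulus), the same counter path syntax can be
checked from a command prompt with the built-in Windows typeperf utility.
Example: typeperf "\Processor(_Total)\% Processor Time" -sc 5 samples the
counter five times.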
Add Agent Performance Counters 1. Select the local or remote Agent.
2. For remote Agents, enter an IP address or
computer
name without "//". Example: 10.2.2.169 or
WEB_SRV5
3. Select a performance object, counter, and
instance.
4. Click "Add" to add it to the New Counter List.
See Also:
Windows Server Monitoring
New Counter List Highlight a counter below to see its description.
Click "Delete" to remove the highlighted counters
from the New Counter List.
Click "Save" to add the New Counter List to the Test.
See Also:
Server Monitoring
1.3.6.2 Linux/UNIX Servers
Help Boxes
Add Linux/Unix SNMP Performance
Counters
1. Enter a host IP address or name.
2. Change Community, if necessary.
3. Select a counter from the drop-down list.
4. To add a counter that is not on the list, enter its OID.
5. Enter or edit the counter name.
6. Click Test to test the counter.
7. Click "Add" to add it to the New Counter List.
See Also:
Linux/UNIX Servers Monitoring
Listed SNMP performance counters
— CPU Counters —
Percentage of user CPU time
Percentage of system CPU time
Percentage of idle CPU time
— Memory Counters —
Total swap size
Available swap space
Total RAM
Total RAM free
Total RAM buffered
Total cached memory
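For reference, these counters can also be read outside StresStimulus with
any SNMP client. A minimal Python sketch using the third-party pysnmp
package (host and community are placeholders; the OID is the standard
UCD-SNMP ssCpuUser counter, "Percentage of user CPU time"):

from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

error_ind, error_status, _, var_binds = next(getCmd(
    SnmpEngine(),
    CommunityData('public'),                  # placeholder community
    UdpTransportTarget(('10.2.2.169', 161)),  # placeholder host
    ContextData(),
    ObjectType(ObjectIdentity('1.3.6.1.4.1.2021.11.9.0'))))  # ssCpuUser.0

if not error_ind and not error_status:
    for name, value in var_binds:
        print(name, '=', value)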
1.3.7 Result Storage
Properties
Property Description
How much data to store Select All to store the fullest dataset. Select Partial to store all data except the content of the individual HTTP sessions. If None is selected, no data will be stored in the database, and only the last test result will be available until StresStimulus is closed.
Data Storage Select the test result data storage from the drop-down. Note: SQL Server CE
capacity is limited to 4 GB.
SQL Server connection string Click … to enter the SQL Server connection information in the pop-up window.
Purge request bodies Purging the bodies of test sessions' requests saves memory. Select All to purge all bodies. Select None to keep all bodies. Select Non-Errors to purge all bodies except those with errors.
Purge response bodies Purging the bodies of test sessions' responses saves memory. Select All to purge all bodies. Select None to keep all bodies. Select Non-Errors to purge all bodies except those with errors. Select Static Mime Types to purge the bodies of images and other static resources.
Save sessions from agents? In distributed tests with SQL Server CE-based storage, the content of the sessions generated on the agents is stored on the agents. Select Yes to copy this content to the controller; this allows generating waterfall diagrams for VUs emulated on the agents. Select No to reduce the traffic between the agents and the controller when network bandwidth is limited.
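Example of the SQL Server connection information (server and database
names below are placeholders, not defaults):
Server=DBSRV1;Database=StresStimulusResults;Integrated Security=True;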
Help Boxes
Result Storage Test results are stored in a database. Configure
the Data Storage type and the amount of data to store.
See Also:
Test Result Storage
Connection Settings Click Create/Check DB to create a new database
or verify connection to the existing database.
Click OK to set the database as data storage for the Test.
Click Cancel to go back without changes.
See Also:
Test Result Storage
1.3.7.1 Test Pass/Fail Qualification
Help Boxes
Test Pass/Fail Configuration You can configure several test quality criteria.
When at least one of these criteria is missed,
the test is qualified as Failed.
To create a test quality criterion, enable a
Pass / Fail condition in the property grid and
specify its acceptable limit.
See Also:
Configuring Test Pass/Fail Qualification
Property Grid
Property Description
Page Goal Misses Enter Yes if Page Goal Misses is subject to the test's Pass / Fail condition.
Page Goal
Threshold
Enter % of Page Goal Misses that triggers the Fail condition. Enter 0 to fail
the test with a single Page Goal violation.
Transaction Goal
Misses
Enter Yes if Transaction Goal Misses is subject to the test's Pass / Fail
condition.
Transaction Goal
Threshold
Enter % of Transaction Goal Misses that triggers the Fail condition. Enter 0
to fail the test with a single Transaction Goal violation.
Request Errors Enter Yes if Request Errors are subject to the test's Pass / Fail condition.
Request Error
Threshold
Enter % of Request Errors that triggers the Fail condition. Enter 0 to fail the
test with a single Request Error violation.
Request Timeouts Enter Yes if Request Timeouts are subject to the test's Pass / Fail
condition.
Request Timeout
Threshold
Enter % of Request Timeouts that triggers the Fail condition. Enter 0 to fail
the test with a single Request Timeout violation.
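The qualification logic described above can be summarized by this minimal
Python sketch (names and structure are illustrative, not StresStimulus
internals): the test fails as soon as any enabled criterion exceeds its
threshold.

def test_passed(observed, thresholds):
    # observed:   measured % per enabled criterion, e.g. {"request_errors": 6.0}
    # thresholds: acceptable % per criterion; 0 fails on a single violation
    return all(observed.get(criterion, 0.0) <= limit
               for criterion, limit in thresholds.items())

# 6% of requests errored against a 5% limit, so the test is qualified as Failed:
print(test_passed({"request_errors": 6.0}, {"request_errors": 5.0}))  # False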
1.3.8 Other Options
Properties
Property Description
Graph sample rate (s) Enter how often the performance counters are read and the graphs are refreshed. The recommended value is 10 s with agents and 5 s without agents. Increase the sample rate for long tests. For example, a one-hour test sampled every 10 s produces 360 data points per curve.
Pre-run Command Line A command line to execute before the test starts.
Pre-run Command Timeout A timeout limit for the pre-run command to complete.
MIME Types
requested
sequentially
Click the drop-down and enter MIME types whose requests must be issued
only after receiving all previous responses (sequentially). Some MIME types
(e.g. Text/HTML) are always requested sequentially. You can enter additional
MIME types to prevent dependent requests of these types from being
requested in parallel with other dependent requests on a page. Separate
multiple entries by ",". For example, enter "image,video" to request all images
and videos sequentially; enter "video/mp4" to Request MP4 video sequentially.
Enable Dynatrace integration? Select Yes to add the x-dynaTrace header to each issued request.
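Example of a pre-run command line (a hypothetical choice; any executable
or batch file works the same way): a batch file that resets application
state, or a built-in command such as iisreset /restart to restart IIS
before the load starts.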
Help Boxes
Other Options For information about properties in this
section, check the following sources:
Pre-run command line
Dynatrace Integration
1.3.9 Script Editor
Toolbar
Button Action
Save Script (Ctrl+S)
Save and Exit Script Editor
Validate against SSScript XSD
Show XSD, the SSScript schema document
Cut (Ctrl+X)
Copy (Ctrl+C)
Paste (Ctrl+V)
Undo (Ctrl+Z)
Redo (Ctrl+Y)
Find (Ctrl+F)
Find Next (F3)
Find Previous (Shift+F3)
Bookmark (Ctrl+F2)
Bookmark Next (F2)
Bookmark Previous (Shift+F2)
Help Boxes
Script Editor The script is an XML representation of the Test Object Model (TOM).
Test modifications can be made by editing the script
or by changing the corresponding settings in the UI.
SSScript XSD SSScript XSD is an XML schema of
StresStimulus scripts. It is used to validate
test scripts and to find errors.
SSScript XSD is provided here as a reference for script development.
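The same validation can be reproduced outside the editor. A minimal
Python sketch using the third-party lxml package (file names are
placeholders):

from lxml import etree

schema = etree.XMLSchema(etree.parse('SSScript.xsd'))  # the schema shown above
script = etree.parse('MyTestScript.xml')               # an exported test script
print(schema.validate(script))                         # True if the script conforms
for error in schema.error_log:                         # line-numbered errors, if any
    print(error)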
1.4 Run and Monitor Test
Click Run Test in the Workflow Tree. The following message box will appear.
• Click Run and Monitor Test to start the test normally.
• Click Debug to run the test in debug mode. All replayed sessions will be displayed in the
session grid and response bodies will not be purged.
• Click Cancel to go back.
1.4.1 Runtime Dashboard
Toolbar
Button Action
Stop Click Stop to Abort the Test Run
Click Pause to suspend the Test
Click Resume to resume the Test
Click to add VUs to the Test
Skip pending requests (trigger a timeout) and continue the test
Retrieve sessions from the Test Log (delayed)
Health Monitor - Normal
Health Monitor - High Load. CPU utilization is approaching the acceptable limit
Health Monitor - Overloaded. Stop unessential processes or reduce the number of VUs
Select Graph Layout
Graph Layouts
Option Description
One Graph
Two Horizontal Panels
Two Vertical Panels
Three Horizontal Panels
Three Vertical Panels
Three Panels Oriented Left
Three Panels Oriented Right
Three Panels Oriented Top
Three Panels Oriented Bottom
Four Horizontal Panels
Four Vertical Panels
Four Panels
Help Boxes
Graphs The Graphs display instant performance characteristics
and performance counter data, plotted with the
frequency defined by the Sample Rate period.
- To select the graph panel layout, click "Select Graph Layout"
- To select which graph to display in a graph panel, click
the drop-down above it.
For more graph commands, right click a graph.
See Also:
Runtime Dashboard
Agents / Test Cases Progress Grid
Test Run Commands - To stop the test, click "Stop".
- To Pause/Resume test run, click "Pause/Resume".
- To increase VU count on demand, set the
VU adjustment value and click "+".
- To abandon pending requests, click "Skip".
Note:
- VU Adjustment works with a "Steady Load" pattern only.
See Also:
Runtime Dashboard
Controlling the Test
Test Engine Health Status
For accurate load testing, CPU utilization should not
exceed 85%.
- Green: Normal. CPU Utilization is under 85%.
- Yellow: High Load. CPU utilization is 85-95%
and is approaching the acceptable limit.
- Red: Overloaded. CPU utilization exceeds 95%.
Metrics accuracy can be impaired. Stop unessential
processes or reduce the number of VUs.
See Also:
Monitoring Test Progress and Health
1.4.1.1 Graphs
Graph Context Menu
Icon Name Description
Un-Zoom One Undo one Zoom
Un-Zoom All Full zoom-out
Unhide All Curves
Maximize Graph Switch this graph to the one panel layout
Show sessions in range Show sessions sent/received during the displayed time range (1 minute delayed)
Copy Image Copy Image to the clipboard
Save Image As Save graph as an image
Print Graph Print graph
Export Graph CSV Export data points to a CSV file
Curve Context Menu
Icon Name Description
Hide All but This Hide all curves except the selected one
Copy Curve Data Copy curve datapoints to the clipboard
Export Series CSV... Export curve datapoints as CSV
Available Graphs
Name Details
KPI Key Performance Indicators
Windows server(s) Performance Counters
Pages
Transactions
Test Cases
Help Boxes
Graph context menu To show sessions sent/received within a time range:
1. Select the time range to zoom it to a full graph
2. Click "Show sessions in range…"
The sessions will be displayed in the session grid.
Note: The test log is updated with a one minute delay.
To zoom-out one/all step, click Un-Zoom One/All.
To stop/resume time auto-scrolling, scroll to the left/right.
To show hidden curves, click "Unhide".
Other commands:
- Copy, Save, Print Graph Image
- Export Graph datapoints
For more options, right-click a curve.
Graph curve context menu To hide all but the selected curve, click "Hide".
To unhide all curves, in the graph context menu,
click "Unhide".
To copy or export curve data, click Copy or Export.
1.4.1.2 Curve Grid
Help Boxes
Curve Grid context menu To expand / collapse curve rows, double-click a graph row
or click on the plus / minus image.
To show/hide a curve on a graph, check/uncheck the box on
the corresponding curve row.
To highlight a curve on a graph, click the curve name on the
corresponding curve row.
Context Menu
Name Description
Highlight Curve Highlight the curve on the graph
Hide All Curves But This Show just this curve on the graph
Unhide All Curves Show all curves on the graph
Curve Grid
Column Description
Visible Check/uncheck the box to show/hide the curve
Parameter The name of the parameter represented by the curve
Color The curve color
Range The scale of the chart axis for this parameter
Min Minimum value of the parameter
Max Maximum value of the parameter
Avg Average value of the parameter
Last Last value of the parameter
Warnings Number of threshold violation warnings
Errors Number of threshold violation errors or missed goals
1.4.1.3 Test Progress Panel
Parameters
Name Description
Time The time elapsed from the beginning of the test.
Users The number of instantiated VUs.
Iterations Started The number of started test iterations.
Iterations Ended The number of completed test iterations.
Requests Sent The number of issued requests.
Requests Pending The number of issued requests whose responses have not been received yet.
Responses OK The number of received responses, excluding errors and timeouts.
Errors The number of errors.
Timeouts The number of timeouts.
SQL CE Capacity Used The percentage of the 4 GB storage limit used to store the test data accumulated up to this point. If the used SQL CE capacity may reach 100%, learn how to Reduce Test Storage Use.
Help Boxes
Test Progress Panel Test Progress Panel displays test progress
parameters. For more information, see
Monitoring Test Progress and Health.
If the used SQL CE capacity may reach 100%,
learn how to Reduce Test Storage Use.
1.4.1.4 Agents and Test Cases Grid
Column Description
Name The Test Case or Agent Name
Users The number of active VUs
Iterations Started The number of started test iterations
Iterations Ended The number of completed test iterations
Requests Sent The number of issued requests
Responses Received The number of received responses
Errors The number of errors
Timeouts The number of timeouts
1.5 Analyze Results
1.5.1 Opening Previous Results
Toolbar
Button Action Description
Refresh Previous Results Click to refresh the Previous Results list
Import Result Click to open an SQL CE .sdf file
Open Result Open the selected result in a new tab
Select All
Unselect All
Compare Tests Generate a multi-test report for comparing selected
results
Configure Result Storage
settings
Delete the checked results
Help Boxes
Previous Results The list below displays the Results of previous test runs.
- To refresh the result list, click "Show Previous Results".
- To load the selected Result in a new result tab, click "Open"
or double-click the Result.
- To compare several Results, check boxes next to them
and click "Compare Tests".
- To delete Results, check boxes next to them and click
"Delete".
- To rename a Result, right-click and select Rename.
- To change the data storage settings or the amount of
the saved data, click "Configure Result Storage settings".
See Also:
Opening Previous Results
Comparing Tests
Test Result Storage
Property Grid
Property Description
Data Storage Type The type of repository storing the result
Result name The name automatically created for every test run. In SQL CE, it's the name of
the .sdf file.
Location SQL Server CE file name
Date The test run date
Size (KB) The SQL Server CE file size
1.5.2 Test Result Tab
Toolbar
Button Action Description
Summary Test Results at-a-glance
Graphs Graphs of Key Performance Indicators and Performance
Counters
Details Details
Errors Errors
VU Activity VU Activity Chart
Waterfall Waterfall Chart
Select Layout Select Layout
Show Sessions Show sessions matching selection criteria
External report Create external report
Back to Previous
Results
Back to Previous Results
Help Boxes
Test Result Views Click a button on the left to select one
of the following test result views:
Summary, Details, Graphs, Errors,
VU Activity, Waterfall
In the selected view, right-click for more
options and help information.
See Also:
Test Result Tab
Other Test Result Commands - To select a graph or grid panel layout, click "Select Layout"
- To select sessions from the Test Log, click "Show Sessions"
- To generate a report, click "External Report"
- To select a Multi-document (default) or a
Single-document report option, click the drop-down.
See Also:
Query Log
External Reports
1.5.2.1 Graphs
Graph Context Menu
Icon Name
Un-Zoom All
Un-Zoom One
Unhide All Curves
Show sessions in range
Copy Image
Save Image As...
Print Graph...
Export Graph CSV...
Graph Curve Context Menu
Icon Name
Hide All But This
Copy Curve Data
Export Curve CSV...
Help Boxes
Graph Context Menu To show sessions related to this page or transaction,
sent/received within a time range:
1. Select the time range to zoom it to a full graph
2. Click "Show sessions in range…"
The sessions will be displayed in the session grid.
To zoom-out one/all step, click Un-Zoom One/All.
To show hidden curves, click "Unhide".
Other commands:
- Copy, Save, Print Graph Image
- Export Graph datapoints
For more options, right-click a curve.
See Also:
Graph Context Menu
Curve Context Menu
1.5.2.2 Detail View
Help Boxes
Page Details The "Page Details" grid displays performance
characteristics of each page from the end-user
perspective.
Note: Page response time includes times for loading
performance-impacting requests. It excludes requests
loaded after the page is displayed (e.g. AJAX requests),
as determined by StresStimulus.
See Also:
Page Details
Transaction Details The "Transaction Details" grid displays performance
characteristics of each transaction from the end-user
perspective.
See Also:
Transaction Details
Request Details The "Request Details" grid displays aggregated
performance characteristics of each request
grouped by URL. Time characteristics are averaged.
Request counts are summed.
If a request timed out and subsequently failed,
it's counted as a timeout.
See Also:
Request Details
Virtual User Details The "VU Details" grid displays statistics
of the test Iterations executed by every VU.
See Also:
VU Details
Test Case Details The "Test Case Details" grid displays
performance characteristics of each test case.
See Also:
Test Case Details
Test Case Group Details The "Test Case Group Details" grid displays
performance characteristics of each test case group.
See Also:
Test Case Group Details
Agent Details The "Agent Details" grid displays performance
characteristics of each agent.
See Also:
Agent Details
1.5.2.3 VU Activity
Help Boxes
Activity Chart - The horizontal axis is a timeline.
- The vertical axis shows VUs.
- Each horizontal bar represents a test
iteration executed by a VU.
- To zoom-in to a specific VUs/Iterations range,
select an appropriate rectangular area.
- To zoom-out, right-click and select Un-Zoom.
For more options, right-click a horizontal bar.
Other context menu commands:
- Copy, Save, Print Graph Image
See Also:
Graph Context Menu
Activity Chart Context Menu Each horizontal bar represents a test iteration executed
by a VU.
- To display a waterfall of a selected iteration, click
View Waterfall.
- To compare the waterfall of the selected Iteration with
a previously selected waterfall, click Compare Waterfalls.
See Also:
Graph Context Menu
Iteration Bar Context Menu
Icon Name Description
View Waterfall Dbl-Click to display a waterfall for this VU/iteration
Compare Waterfalls Ctrl+Dbl-Click to display this VU/iteration in a dual waterfall on the right
Chart Context Menu
Icon Name Description
Un-Zoom Click to fully Un-Zoom this chart
Copy Image Copy Image to the clipboard
Save Image As... Save Graph as an Image
Print Graph... Print Graph
1.5.2.4 Iteration Waterfall
Toolbar
Icon Description
Enter the first VU
Enter the first VU iteration
Check to compare two waterfalls
Enter the second VU
Enter the second VU iteration
Refresh the waterfall charts
Check to enable Zoom/Scroll Synch on the left & right chart
Swap The Charts
Context Menu
Icon Name Description
Auto-Sync Check to enable Zoom/Scroll Synch on the left & right chart
Diagonal Scrolling Check to enable Diagonal Scrolling on this chart
Un-Zoom Click to fully Un-Zoom this chart
Copy Image
Save Image As...
Print Graph...
Help Boxes
Waterfall View - Select a VU and iteration.
- To compare two waterfalls, check Compare
and select a second VU and iteration.
- Click Refresh to refresh the charts.
See Also:
Waterfall View
Single Waterfall Chart
Dual Waterfall Chart
Waterfall Chart Commands - To swap the charts, click Swap.
- To turn on/off synchronization of chart
scrolling and zooming, click Sync/Un-sync.
See Also:
Waterfall View
Single Waterfall Chart
Dual Waterfall Chart
1.5.2.5 Query Log
Help Boxes
Query Test Log To display selected replayed sessions from the Test Log in the
session grid, enter selection criteria and click "Show Sessions".
Selection criteria formats and examples:
- for VUs, Iterations and Sessions: 1-3, 5, 9;
- for responses with Errors and/or Timeouts: check 1 or 2 boxes;
- to filter by time range, check the box, select Send, Received, or
both, and enter the time range in seconds;
- for Test Cases and Agents: Name1, Name2;
Note:
- Leaving textboxes empty will broaden the search.
- Retrieving more than 1,000 records (as entered in the
Max Sessions box) can impact performance.
See Also:
Querying Test Log
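To make the range format concrete, a minimal Python sketch that expands a
criteria string such as "1-3, 5, 9" (illustrative only, not StresStimulus
code):

def expand_ranges(criteria):
    values = set()
    for part in criteria.split(','):
        part = part.strip()
        if '-' in part:
            low, high = part.split('-')
            values.update(range(int(low), int(high) + 1))
        elif part:
            values.add(int(part))
    return sorted(values)

print(expand_ranges("1-3, 5, 9"))  # [1, 2, 3, 5, 9]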
1.5.3 Page and Transaction Result Tab
Toolbar
Button Action Description
Summary Page/Transaction summary
Performance Page/Transaction response time
Latency Page/Transaction Latency/Server time breakdown
Failures The number of failures on the Page/Transaction
% Failures The percentage of failures on the Page/Transaction
Requests Page/Transaction requests
VU Activity VU Activity Chart
Waterfall Waterfall Chart
Show Sessions Show sessions matching selection criteria
Back to the test result
Help Boxes
Page Result Views Click a button on the left to select one of the
following page result views: Summary,
Performance, Latency, Failures, Failures %,
Requests, VU Activity, Waterfall.
In the selected view, right-click for more
options and help information.
- To select sessions from the Test Log,
click "Show Sessions".
- To go back to the Test Result, click Back.
See Also:
Page & Transaction Result Tabs
Querying Test Log
Transaction Result Views Click a button on the left to select one of the
following transaction result views: Summary,
Performance, Latency, Failures, Failures %,
Requests, VU Activity, Waterfall.
In the selected view, right-click for more
options and help information.
- To select sessions from the Test Log,
click "Show Sessions".
- To go back to the Test Result, click Back.
See Also:
Page & Transaction Result Tabs
Querying Test Log
1.5.3.1 Summary View
Help Boxes
Summary View Summary view lists basic page or transaction performance
metrics and failures. It includes subsections that can be
expanded/collapsed by clicking the triangle icon.
See Also:
Summary View
1.5.3.2 Performance View
Help Boxes
Performance View Performance view presents a page or transaction response
timeline and its changes depending on the number of emulated
VUs. It features five curves: the minimum, average and
maximum response times, the goal and the number of VUs.
See Also:
Performance View
Graph Context Menu To show sessions related to this page or transaction,
sent/received within a time range:
1. Select the time range to zoom it to a full graph
2. Click "Show sessions in range…"
The sessions will be displayed in the session grid.
To zoom-out one/all step, click Un-Zoom One/All.
To show hidden curves, click "Unhide".
Other commands:
- Copy, Save, Print Graph Image
- Export Graph datapoints
For more options, right-click a curve.
See Also:
Graph Context Menu
Graph Curve Context Menu To hide all but the selected curve, click "Hide".
To unhide all curves, in the graph context menu,
click "Unhide".
To copy or export curve data, click Copy or Export.
See Also:
Graph Curve Context Menu
1.5.3.3 Latency View
Help Boxes
Latency View Latency view presents a page or transaction
response time breakdown between Latency
and Server Time. The latency (or network time)
is the portion of the response time attributed to
network delays, i.e. the time necessary for server
responses to reach the client.
See Also:
Latency View
1.5.3.4 Failures View
Help Boxes
Failure View Failure view helps analyze the number of page
or transaction failures. The graph presents a timeline
of errors, timeouts, missed goals and their changes
depending on the number of emulated VUs.
See Also:
Failure View
Failure % View Failure % view helps analyze the percentage of page
or transaction failures. The graph presents a timeline
of errors, timeouts, missed goals and their changes
depending on the number of emulated VUs.
See Also:
Failure View
1.5.3.5 Requests View
Help Boxes
Request View The request grid displays aggregated
performance characteristics of each request
related to this page or transaction and grouped
by URL. Time characteristics are averaged.
Request counts are summed.
If a request timed out and subsequently failed,
it's counted as a timeout.
See Also:
Request View
1.5.3.6 VU Activity View
Activity Chart Context Menu
Icon Name Description
Un-Zoom Click to fully Un-Zoom this chart
Copy Image Copy Image to the clipboard
Save Image As... Save Graph as an Image
Print Graph... Print Graph
Iteration Bar Context Menu
Icon Name Description
View Waterfall Dbl-Click to display a waterfall for this VU/iteration
Compare Waterfalls Ctrl+Dbl-Click to display this VU/iteration in a dual waterfall on the right
Help Boxes
VU Activity View VU Activity View shows the activity of every VU
during the test. Each row in the chart represents
an individual VU. The row is broken down into
differently colored horizontal bars, each of which
represents a single test iteration.
The x-axis displays the timeline of the load test run.
See Also:
Page & Transaction Result Tabs
Activity Chart - The horizontal axis is a timeline.
- The vertical axis shows VUs.
- Each horizontal bar represents a test
iteration executed by a VU.
- To zoom-in to a specific VUs/Iterations range,
select an appropriate rectangular area.
- To zoom-out, right-click and select Un-Zoom.
For more options, right-click a horizontal bar.
Other context menu commands:
- Copy, Save, Print Graph Image
See Also:
Graph Context Menu
Activity Chart Context Menu Each horizontal bar represents a test iteration executed
by a VU.
- To display a waterfall of a selected iteration, click
View Waterfall.
- To compare the waterfall of the selected Iteration with
a previously selected waterfall, click Compare Waterfalls.
See Also:
Graph Context Menu
1.5.3.7 Waterfall View
Toolbar
Icon Description
Enter the first VU
Enter the first VU iteration
Check to compare two waterfalls
Enter the second VU
Enter the second VU iteration
Refresh the waterfall charts
Swap The Charts
Check to enable Zoom/Scroll Synch on the left & right chart
Navigate to VU Activity chart
Context Menu
Icon Name Description
Auto-Sync Check to enable Zoom/Scroll Synch on the left & right chart
Diagonal Scrolling Check to enable Diagonal Scrolling on this chart
Un-Zoom Click to fully Un-Zoom this chart
Copy Image
Save Image As...
Print Graph...
Help Boxes
Waterfall View - Select a VU and iteration.
- To compare two waterfalls, check Compare
and select a second VU and iteration.
- Click Refresh to refresh the charts.
See Also:
Waterfall View
Single Waterfall Chart
Dual Waterfall Chart
Waterfall Chart Commands - To swap the charts, click Swap.
- To turn on/off synchronization of chart
scrolling and zooming, click Sync/Un-sync.
See Also:
Waterfall View
Single Waterfall Chart
Dual Waterfall Chart
1.5.4 Comparing Tests
Help Boxes
Compare multiple tests Click a button on the left to select
Summary or KPI Graph view.
In the selected view, right-click for more
options and help information.
See Also:
Test Comparison Summary View
KPI Graph Comparison View
Toolbar
Button Action Description
Summary Test Comparison Summary
KPI Graph KPI Graph Comparison
1.6 Workflow Tree Toolbar
Toolbar
Back - Return one step back
Tree View - Toggle to display the Test Case Tree on the left pane
Grid View - Toggle to display the Session Grid on the left pane
Show recorded Test Case sessions in the session grid
Test Wizard
Run - Start a load test
Help Boxes
Workflow Tree Toolbar - Click "Back" to go back one step on the
Workflow Tree.
- Click "Tree View" to display the Test Case Tree
on the left pane.
- Click "Grid View" to display the session Grid
on the left pane.
- Click "Show Recorded" to display Test Case sessions
in the session grid.
Test Wizard / Run Test - Click "Test Wizard". The wizard will guide you
through the major steps of creating, configuring
and running a test.
- Click "Run" to start the test.
Graphs will display the test results in progress.
After the test completes, select reports from the
Analyze Results section in the Workflow Tree.
See Also:
Starting Test
1.7 Test Wizard
1.7.1 Record Test Case
1.7.2 Configure Test Case
1.7.3 Configure Test
1.7.4 Run Test
1.7.5 Analyze Results
1.7.6 Record Test Case
Help Boxes
Create a Test Case To record a Test Case by navigating through your application,
select the recording source and click "Record".
See Also:
Recording a test case
Browser Recording Settings - Enter the initial URL and select a browser cache option.
- In Private Mode (recommended), browser cache is not used.
- Enter the first transaction name (optional) and click Record
See Also:
Recording with Web Browser
1.7.7 Configure Test Case
1.7.7.1 Targeted Hosts
Icon Description
Delete the requests to the selected hosts
Add the selected hosts to the Excluded Hosts list and delete requests to these hosts
Show the Excluded Hosts list
Select All
Test Case Hosts This list displays hosts targeted in this Test Case.
Toolbar commands:
- Delete requests to the selected hosts from the Test Case.
- Add the selected hosts to the Excluded Hosts list.
Requests to these hosts will be ignored in future recordings.
- Show the Excluded Hosts list.
See Also:
Purging requests to unwanted hosts
1.7.7.2 Content-Types
Icon Description
Delete the sessions with the selected content types
Add the selected content types to the Excluded Content Types list and delete sessions with
these content types
Show the Excluded Content Types list
Select All
Test Case Content Types This list displays content types used in this Test Case.
Toolbar commands:
- Delete sessions with the selected content types from the Test Case.
- Add the selected content types to the Excluded Content Types list.
Sessions with these content types will be ignored in future recordings.
- Show the Excluded Content Types list.
See Also:
Purging sessions with the unwanted content types
1.7.7.3 Autocorrelation
AutoCorrelation Autocorrelation is the automatic modification of requests
issued during a test run to replace recorded values
with corresponding dynamic values received from
the server in the previous responses. Autocorrelation
is necessary to preserve application integrity in dynamic
websites and avoid server errors.
The wizard will now find and configure hidden
autocorrelation parameters.
See Also:
AutoCorrelation
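Conceptually, autocorrelation behaves like this minimal Python sketch:
extract a dynamic value from the previous response and substitute it into
the next recorded request. (The __VIEWSTATE field and the regular
expressions are illustrative; StresStimulus configures this automatically.)

import re

previous_response = '<input name="__VIEWSTATE" value="AbC123==" />'
recorded_body = '__VIEWSTATE=OLDVALUE&btnSubmit=Go'

# Extract the live value the server just issued...
match = re.search(r'name="__VIEWSTATE" value="([^"]+)"', previous_response)
if match:
    # ...and replace the recorded value before replaying the request.
    replayed = re.sub(r'(__VIEWSTATE=)[^&]*',
                      lambda m: m.group(1) + match.group(1),
                      recorded_body)
    print(replayed)  # __VIEWSTATE=AbC123==&btnSubmit=Go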
1.7.8 Configure Test
Help Boxes
Load Pattern Load pattern defines the dynamics of virtual users (VUs)
throughout the test.
See Also:
Load Pattern
Test Duration Set the test completion criteria. After reaching
this condition, the test will stop.
See Also:
Test Duration
1.7.9 Run Test
Help Boxes
Run Test The wizard will now start the test execution.
For more information, see
Running and Monitoring Test
1.7.10 Analyze Results
Help Boxes
Analyze Results The wizard will now navigate
through the main test results.
For more information, see
Analyzing Results
2 OBJECT AREA
2.1 Test Case Tree
Page Context Menu Commands
Icon Description
Rename Page
Edit Page
Clone Page
Delete Page
Session Context Menu Commands
Icon Description
Show Session Inspector
Edit Session
Clone Session
Delete Session
Create Response Extractor
Create Req. URL Parameter
Create Req. Header Parameter
Create Response Validator
2.1.1 Upper Toolbar
Toolbar
Icon Description
Expand All
Collapse All
Edit the selected object
Delete the selected object
Test Case hosts
Dock to Fiddler on the left
Dock to StresStimulus on the right
Help Boxes
Test Case Modification To edit Tree objects or view more details:
- Select an object and click "Edit".
- Double-click a page to navigate to the Test Case Settings grid.
- Double-click a request to display the Session Inspector.
- Double-click an Extractor to navigate to the Extractors section.
- Double-click a parameter to navigate to the Parameters section.
- Double-click a validator to navigate to the Validators section.
- To delete an object, select it in the Test Case Tree and
click "Delete" or hit (Del).
- To delete multiple sessions, select them in the session grid
and hit (Ctrl+Del).
- To add new sessions, select them in the session grid and drag
and drop them into the desired position in the Test Case Tree.
- To reposition the selected request or page in the Test Case
Tree, drag and drop it into a new position.
Test Case Tree Commands - To show hosts targeted in this Test Case,
click "Test Case hosts"
- To dock the test case tree to Fiddler on the left
or to StresStimulus on the right, click "Dock…"
2.1.1.1 Test Case Hosts
Toolbar
Button Action
Delete the requests to the selected hosts
Add the selected hosts to the Excluded Hosts list
Show the Excluded Hosts list
Help Boxes
Test Case Hosts This list displays hosts targeted in this Test Case.
Toolbar commands:
- Delete requests to the selected hosts from the Test Case.
- Add the selected hosts to the Excluded Hosts list.
Requests to these hosts will be ignored in future recordings.
- Show the Excluded Hosts list.
See Also:
Purging requests to unwanted hosts
2.1.2 Lower Toolbar
Toolbar
Button Action
Find Next (F3)
Find Previous (Shift+F3)
Find Sessions by Content ... (Ctrl+F)
Clear Search
Delete the found highlighted sessions
Filter Objects: Click the button to show all objects; click the drop-down to select the session type.
Filter Objects: Click the button to show sessions only; click the drop-down to select the session type.
Filter Objects drop-down options:
Click to show recorded Primary requests
Click to show all recorded requests, except images, stylesheets and scripts
Click to show all recorded requests
Click to show recorded requests with errors and warnings
Hide autocorrelation parameter detail
Show autocorrelation parameter details
Help Boxes
Session Search / Filter - To find a URL, start typing in the Search URLs box.
- To find the next/previous URL, click "Find Next/Previous".
- To find and highlight sessions by request/response
content, click Find Sessions by Content or hit (Ctrl+F).
- To clear session highlight, click "Clear Search".
- To delete highlighted sessions, click "Delete highlighted".
- To toggle between showing All Objects and
Sessions Only, click "Filter Objects".
- To select which sessions to display, click the
"Filter Objects" drop-down.
- To show or hide autocorrelation parameter details,
click "Show" or "Hide".
See Also:
Searching Test Case Tree
Filtering Test Case Tree
2.1.3 Session Inspector
Toolbar
Button Action
Unlock for Editing
Save session changes
Split the window at 1/4
Split the window at 1/2
Split the window at 3/4
Help Boxes
Session Inspector Session Inspector displays:
- Request in the top text box
- Response in the bottom text box
To edit the session content, check the
"Unlock for Editing" box.
2.2 Session Grid
Help Boxes
Fiddler Grid To view test sessions in the Fiddler Grid, click the arrow on
the "Show …" split-button on the toolbar above Workflow
Tree and select which sessions to show.
1. "VU number" column displays a VU in the <User XXX> format.
2. Iterations and requests are displayed in the
column "Iter-URL" as < YYY-ZZZ>, where
YYY is an iteration number for the user XXX,
and ZZZ is a request number within the iteration.
3. Replayed Sessions: Primary requests are displayed in
bold-gray. Dependent requests are displayed in gray.
4. To delete selected recorded sessions from a test case,
hit (Ctrl+Del).
OBJECT AREA - SESSION GRID 84
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference
StresStimulus Load Testing Tool User Interface Reference

More Related Content

StresStimulus Load Testing Tool User Interface Reference

  • 2. User Interface Reference v2 TABLE OF CONTENTS 1 Workflow Tree and Functional Area ..................4 1.1 Record Test Case ............................................................................................................ 5 1.1.1 Browser Cache.................................................................................................................. 6 1.2 Build Test Case ............................................................................................................... 7 1.2.1 Test Case Settings ............................................................................................................ 7 1.2.2 Authentication.................................................................................................................. 14 1.2.3 Variables ......................................................................................................................... 15 1.2.4 Parameters...................................................................................................................... 25 1.2.5 Response Validators ....................................................................................................... 30 1.2.6 Verify & Auto-config......................................................................................................... 31 1.2.7 Multi Test Cases.............................................................................................................. 36 1.3 Configure Test ............................................................................................................... 39 1.3.1 Load Pattern.................................................................................................................... 40 1.3.2 Test Duration................................................................................................................... 41 1.3.3 Browser Type .................................................................................................................. 42 1.3.4 Network Type .................................................................................................................. 43 1.3.5 Load Agents .................................................................................................................... 44 1.3.6 Monitoring ....................................................................................................................... 46 1.3.7 Result Storage................................................................................................................. 49 1.3.8 Other Options.................................................................................................................. 51 1.3.9 Script Editor..................................................................................................................... 52 1.4 Run and Monitor Test.................................................................................................... 53 1.4.1 Runtime Dashboard......................................................................................................... 53 1.5 Analyze Results............................................................................................................. 60 1.5.1 Opening Previous Results ............................................................................................... 60 1.5.2 Test Result Tab............................................................................................................... 
61 1.5.3 Page and Transaction Result Tab ................................................................................... 68 1.5.4 Comparing-Tests............................................................................................................. 74 1.6 Workflow Tree Toolbar.................................................................................................. 75 1.7 Test Wizard .................................................................................................................... 76 1.7.1 Record Test Case............................................................................................................ 76 1.7.2 Configure Test Case........................................................................................................ 76 1.7.3 Configure Test................................................................................................................. 76 1.7.4 Run Test.......................................................................................................................... 76 1.7.5 Analyze Results............................................................................................................... 76 1.7.6 Record Test Case............................................................................................................ 76 1.7.7 Configure Test Case........................................................................................................ 77 WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 1
  • 3. User Interface Reference v2 1.7.8 Configure Test................................................................................................................. 78 1.7.9 Run Test.......................................................................................................................... 79 1.7.10 Analyze Results............................................................................................................... 79 2 Object Area .......................................80 2.1 Test Case Tree............................................................................................................... 80 2.1.1 Upper Toolbar ................................................................................................................. 80 2.1.2 Lower Toolbar ................................................................................................................. 82 2.1.3 Session Inspector............................................................................................................ 83 2.2 Session Grid .................................................................................................................. 84 3 Other Elements ....................................86 3.1 Main Menu...................................................................................................................... 86 3.1.1 Standalone Version ......................................................................................................... 86 3.1.2 Add-on (Integrated) Version ............................................................................................ 91 WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 2
  • 4. User Interface Reference v2 User interface (UI) reference explains the options that appear in StresStimulus windows, dialog boxes and other UI elements. Much of this material is available in the UI. The purpose of this document is to combine this information into a searchable document. Because all topics are hierarchically organized, access to configuration settings, or functions can be quickly located, not just by content, but also by context. For example, search results for the term "think time" point to the page with the following path: Workflow Tree and Functional Area -> Build Test Case -> Test Case Settings While the topic, Test Case Settings, does not explain how to navigate to its functionality, its contexts suggest that user has to click to on the Build Test Case node and then Test Case Settings node of the Workflow Tree to configure page think time. This section includes the help content embedded into StresStimulus and easy accessible as contextual help. StresStimulus help content includes the following information: • Treeviews, toolbars, menus. Every node of a TreeView, a Toolbar button or an item of a menu is presented by its icon, description and a tooltip, if exist. • Help boxes: Virtually every window, dialog box and toolbar has one or several embedded help boxes, each of which pop-up when mouse-over or click a corresponding light-bulb icon. • Property grids. The name of properties and their description is provided for every object displayed in a property grid. • Data grids: Column names and their tooltips are provided for every grid. WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 3
  • 5. User Interface Reference v2 1 WORKFLOW TREE AND FUNCTIONAL AREA This chapter describes every node on the Workflow Tree and corresponding Functional Area. The Navigation tree topics in the documentation hierarchically correspond to the Workflow Tree in the application, so that all topics found here can easily be located. For example, if you search for "Data Generators" in the documentation, just follow the Navigation tree structure in order to find Data Generators section in the application. Workflow Tree includes four top-level nodes each of which match a related testing step: Button Name Action Record Test Case Create a Test Case by navigating your application OR open an existing Test Case Build Test Case Create and configure Test Case Configure Test Configure Test and load level Run and Monitor Test Start a load test and monitor performance … Analyze Results Open test run results and analyze performance metrics WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 4
  • 6. User Interface Reference v2 1.1 Record Test Case Toolbar Icon Action Description Status: Recording Status: Paused Click to Pause recording. Resume Click to Resume recording. Stop Close the browser and set the test case in StresStimulus. Start New Transaction End Transaction Cache Clear browser cache and cookies. Help Boxes Recorder To record a test case: 1. Go to the tested website to start recording. 2. To skip recording of some pages, click "Pause". 3. Complete navigating web pages. 4. Click "Stop" to close the web browser. A new test case will be created. Notes: To configure clearing browser cache, click "Cache" (IE only) See Also: Recording with Web Browser Recording from Other Sources Transactions To create a transaction 1. Enter its name and Goal (optional). 2. Click "Start new transaction". 2. Complete navigating through transaction steps. 3. Click "End transaction". WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 5
  • 7. User Interface Reference v2 See Also: Creating Transactions Transactions To create a transaction, enter its name and complete navigating through transaction steps. Entering a new transaction name will designate the beginning of the subsequent transaction. See Also: Creating Transactions 1.1.1 Browser Cache Help Boxes Clear Browser Cache Options Recorder automatically clears resources from the targeted domains on the list. - Click View/Edit to access the Clear Cache Domain List - Check the "Clear" box to enable automatic cache clearing before recording. - Check the "After recording" box to automatically add new targeted domain to the list - Select what resources you want to clear. See Also: Automatic browser cache clearing Clear Cache Domain List Clear Cache Domain List includes domains, which resources will be cleared from the browser cache before recording test cases. It is recommended to clear browser cache before recording a test case. Clear Cache Domain List is used for automatic cache clearing. You can edit this list. See Also: Automatic browser cache clearing WORKFLOW TREE AND FUNCTIONAL AREA - RECORD TEST CASE 6
  • 8. User Interface Reference v2 1.2 Build Test Case The Build Test Case section includes the following nodes: 1-st Level 2-nd Level Name Description Test Case Settings Configure Test Case, Page and Request Parameters Transactions & Loops Transaction and Loop Settings Authentication Set VU credentials Source Variables Define Variables that can be use for Parameterization Extractors Define rules of extracting values from responses Datasets Add or Edit Dataset Data Generators Add or Edit Data Generator Functions Add or Edit Function Instances Parameters Use Source Variables to parameterize Requests Response Validators Add or edit custom rules to validate responses Verify & Auto Config Verify Test Case, find missing parameters and auto- configure them Multi Test Cases Configure multiple Test Cases 1.2.1 Test Case Settings Toolbar Icon Description WORKFLOW TREE AND FUNCTIONAL AREA - BUILD TEST CASE 7
  • 9. User Interface Reference v2 Expand All Collapse All Create a Transaction starting from the selected request Create a Loop starting from the selected request or transaction Edit the selected Transaction or Lo-op Delete the selected Transaction or Loop Help Boxes Transactions & Loops A transaction is a group of sequential requests or pages, representing a complete business transaction, which is grouped for the purpose of tracking its performance. A Loop is a group of sequential requests or pages, requested multiple times within a test case iteration. Unconditional Loops repeat a specified number of times. Conditional Loops have a condition checked at the end of the loop to determine if the loop should continue or exit. See Also: Transactions Conditional Transactions Loops Test Case Settings The tree displays 5 type of objects: Test Case, Loops, Transactions, Pages and Requests. To change the selected object properties, use the property grid below. Toolbar commands: - Expand / Collapse tree - Create or Edit a transaction or loop - Delete selected object See Also: Test Case, Page and Request properties Test Case Settings Context Menu WORKFLOW TREE AND FUNCTIONAL AREA - BUILD TEST CASE 8
  • 10. User Interface Reference v2 Icon Description Insert Transaction Insert Loop Edit Transaction Show on Tree Show Session(s) Clone Page Delete Page Properties Test Case Properties Property Description Name The Test Case name Description The Test Case description URLs The number of requests in the Test Case Request Size (KB) The size of all requests in the Test Case Response Size (KB) The size of all responses in the Test Case Duration (s) The total of the response times of all pages Default Goal (s) The default goal for all pages in the test case. To remove the default goal, leave the property empty. Default Timeout (s) The default timeout for all pages in the test case. WORKFLOW TREE AND FUNCTIONAL AREA - BUILD TEST CASE 9
  • 11. User Interface Reference v2 Property Description Think Times between pages Select "Recorded" to inject the recorded think time after every page. Select "Constant" to use a constant think time. Select "Random" to randomize think time. Tip: for stress tests, select "Zero". Think Time (s) A constant think time between pages. Min Think Time (s) Minimum think time Max Think Time (s) Maximum think time Delay after the test case Tips: For stress tests, select "Zero". To issue iterations with a certain frequency, select Pacing. Delay (s) A constant delay added after the test case replay Iteration Delay (s) A constant delay between iterations Minimum duration of the test case replay (s) Enter the minimum test case replay duration. If the test case replays faster, the appropriate delay will be added. Cache Control Select "Enabled" to emulate browser caching and session management. Select "Disabled" to emulate browsers with disabled caching (all requests will be sent) and restarting browsers before starting a new iteration (browser sessions will not persist across test iterations). New VU % Percentage of the New vs. Returning VUs. Note: (a).On the 1-st iteration, new VUs will have an empty cache, just as a first time user. All requests will be sent. (b).On the 1-st iteration, returned VUs will have a primed cache. Caching rules for each request will be determined based on server caching headers. (c). On the subsequent iterations, all VUs are treated as returned VUs. VU restarting browsers % Percentage of VUs restarting browsers before starting new iteration. For these users, browser sessions will not persist across the test iterations. Page Properties Page number The page number Host The host WORKFLOW TREE AND FUNCTIONAL AREA - BUILD TEST CASE 10
Page properties:
- Page number: The page number
- Host: The host
- Path: The path
- Query: The query string
- Title: The page title
- URLs: The number of requests in the page
- P.I. URLs: The number of performance-impacting requests, which excludes AJAX requests presumed to load after the page is displayed
- Request Size (KB): The size of the page's primary and all dependent requests
- P.I. Request Size (KB): The size of the page's performance-impacting requests
- Response Size (KB): The size of the page's primary and all dependent responses
- P.I. Response Size (KB): The size of the page's performance-impacting responses
- Duration (s): The page response time, i.e. the time required to receive all of the page's responses, excluding those downloaded after the page is complete
- Goal (s): The maximum expected time for all of the page's responses to come back. Iterations where the page response time exceeds the goal are marked as "missed goal". To remove the goal, leave it blank.
- Think Time (s): A delay added at the end of the page to simulate the user's wait time before requesting the subsequent page
- Timeout (s): The maximum amount of time for receiving any of the page's responses. A page timeout error is triggered if any of the page sessions' response times exceeds the Timeout.
- When to Request the Page: Select "On 1st iteration" to skip this page (e.g. login) on subsequent iterations. Select "On last iteration" to request this page (e.g. logout) on the last iteration only, if the test is set to run a specified number of iterations.
Request properties:
- Host: The host
- Path: The path
- Query: The query string
- Timeout (s): The maximum amount of time for receiving the response. A timeout error is triggered if the session's response time exceeds the Timeout.
- Caching Rules: Select "Not Cached" to always request the session, disregarding the recorded caching headers. Select "Cached" to never request the session for Returning VUs with caching enabled. Select "Normal" to use the recorded caching headers.

Transaction properties:
- Name: The Transaction name
- Description: The Transaction description
- Goal (s): The transaction's completion time limit. To remove the goal, leave it blank.
- Think Time (s): A delay added at the end of the transaction to simulate the user's wait time before requesting the subsequent page

Loop properties:
- Name: The Loop name
- Description: The Loop description
- When to run the Loop: Select "Always" to always run the loop. Select "Check a condition first" to compare an extractor against a string to decide whether the loop should run.
- Extractor Name: Select from the drop-down an Extractor to compare with the string
- Loop Type: Select "Unconditional" for a loop repeated a specified number of times. Select "Conditional" for a loop with an exit condition.
- A string to match: The string to compare with the Extractor
- Run the Loop if Match?: Select Yes to run the Loop when the Extractor matches the string and skip it when it does not. Select No to skip the Loop when the Extractor matches the string and run it when it does not.
- Number of Repeats (Max.): For an Unconditional Loop, the number of repeats. For a Conditional Loop, the maximum number of repeats, which cannot be exceeded. To remove the cap, set -1.
- Delay before next Loop (s): The delay before starting the next loop cycle
- Condition Type: The type of condition checked at the end of a Conditional Loop to determine whether the loop should continue or exit (see the sketch after this list). If the condition is based on finding a specified text or regular expression in an HTTP response, select "Text Based". If the condition is based on evaluating an extractor, select "Extractor Based".
- Response to Search: Select from the drop-down the response number in which the specified text or regular expression is searched
- Search String: The character string searched for in the HTTP response
- Search String Type: If the search string is plain text, select "Text". If it is a regular expression, select "Regular Expression".
- Exit Loop if Found?: Select Yes to repeat the loop while the Search String is not found and exit once it is found. Select No to repeat the loop while the Search String is found and exit once it is not found.
- Extractor Name: Select from the drop-down the Extractor used in the "Extractor Based" condition
- Text to compare: The text compared with the Extractor value in the "Extractor Based" condition
- Exit Loop if Match?: Select Yes to repeat the loop while "Text to compare" does not match the Extractor and exit once it matches. Select No to repeat the loop while "Text to compare" matches the Extractor and exit once it does not.
- Run Transaction if Match?: Select Yes to run the Transaction when "Text to compare" matches the Extractor and skip it when it does not. Select No to skip the Transaction when "Text to compare" matches the Extractor and run it when it does not.
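A minimal sketch of the text-based Conditional Loop semantics described above, in Python. The function names and callables are hypothetical; the point is the do-while shape: the body always runs at least once, the condition is checked at the end, and a Max Repeats of -1 means no cap:

```python
import re

def run_conditional_loop(body, get_response, search_string, is_regex,
                         exit_if_found, max_repeats=-1):
    """Text-based Conditional Loop: condition checked at the END of each cycle."""
    repeats = 0
    while True:
        body()                      # replay the requests inside the loop
        repeats += 1
        response = get_response()   # the "Response to Search"
        found = (re.search(search_string, response) is not None
                 if is_regex else search_string in response)
        # "Exit Loop if Found?": Yes -> exit when found; No -> exit when NOT found
        if found == exit_if_found:
            break
        if max_repeats != -1 and repeats >= max_repeats:
            break
        # "Delay before next Loop (s)" would be applied here
```

1.2.2 Authentication

Toolbar:
- Import a .csv file with user credentials

Help box: Server Authentication
Use this grid when the tested website uses Basic or Windows Integrated (e.g. NTLM or Kerberos) authentication. Enter user credentials in the grid, or click Import on the toolbar and select a .csv file with the credentials. The .csv file must have the same three columns as the grid and no header row (see the sketch below). Multiple credential rows are consumed by the VUs in round-robin order.
Note: For Forms authentication, create an Authentication dataset in the Datasets section and use it in the Parameterization section.
See Also: Authentication

Authentication columns:
- Domain: The domain to authenticate a VU
- Username: The username to authenticate a VU
- Password: The password to authenticate a VU

A minimal sketch of the expected file layout and the round-robin consumption, assuming a hypothetical file named credentials.csv:

```python
import csv
from itertools import cycle

# credentials.csv: three columns (Domain, Username, Password), no header:
#
#   CORP,jsmith,Passw0rd!
#   CORP,mjones,S3cret
#
rows = list(csv.reader(open("credentials.csv", newline="")))
credentials = cycle(rows)  # VUs consume rows in round-robin order

for vu in range(5):
    domain, username, password = next(credentials)
    print(f"VU {vu}: {domain}\\{username}")
```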
1.2.3 Variables

Help box: Variables
Variables are evaluated and used during the replay:
1. In Parameters, to replace recorded request values.
2. In Actions, for custom processing (coming soon).
Supported variable types: Datasets, Extractors, Data Generators, and Functions.
See Also: Parameterizing dynamic tests

1.2.3.1 Extractors

Toolbar commands:
- Expand All / Collapse All
- Create an Extractor for a response selected in the Test Case Tree or the session grid
- Edit the selected Extractor
- Show the selected Extractor in the Test Case Tree
- Show the response associated with the Extractor in the session grid
- Move the selected Extractor to the selected response
- Clone the selected Extractor to the selected response(s)
- Hide/Show autocorrelation parameter details
- Delete the selected Extractor(s)
Help box: Extractors
An Extractor defines a rule for retrieving a value from a response. The value is assigned to a variable with the same name, which can be used to parameterize subsequent requests.
See Also: Extractors

Extractor properties:
- URL: The URL of the response that is parsed to extract the value
- Name: The Extractor and its variable name. Tip: to use a name from the response viewer below, select it and click "Set the selected text as Extractor Name".
- Text Before: A text delimiter that occurs immediately before the value to be extracted. Tip: in the response viewer below, select a string before the Extractor value and click "Set the selected text as Text Before". If entering manually, use \n for a new line and \t for a tab.
- Text After: A text delimiter that occurs immediately after the value to be extracted. Tip: in the response viewer below, select a string after the Extractor value and click "Set the selected text as Text After". If entering manually, use \n for a new line and \t for a tab.
- Occurrence: The occurrence of the matching text to be extracted. The default is 1.
- Use HTML Decoding?: Select Yes to apply HTML decoding to the Extractor value (e.g. converting "&gt;" to ">"). The default is No.
- Use URL Decoding?: Select Yes to apply URL decoding to the Extractor value (e.g. converting "%3F" to "?"). The default is No.
- Enforce URL encoding?: The default is No. When an extractor is used in a parameter, StresStimulus automatically URL-encodes its value when necessary. Select Yes to enforce URL encoding of the extractor's value. Use this option with caution, as it can result in double encoding in a parameter.
- Returned recorded value: The value returned by the Extractor from the recorded response
- Description: The Extractor description
- Regular Expression: A regular expression with a single capture group named <val>, whose match is extracted. Example: \w+\d="(?<val>.+)" returns the value of a name/value pair whose name ends with a digit.
- Header: The response header name, selected from the drop-down
- Form Field: The form field name, selected from the drop-down
- XPath query: An XPath query that extracts a value from a web service XML response

Create Extractor toolbar:
- Set the selected text as "Text Before"
- Set the selected text as "Text After"
- Set the selected text as the Extractor Name
- Find Previous (Shift+F3) / Find Next (F3)
- Verify the Extractor
- Save the Extractor and close this window
- Save the Extractor and start a new one

Help box: Create Extractor
To create an Extractor:
1. Select a session in the Test Case Tree or session grid.
2. Select an Extractor Type from the list above.
3. Configure the Extractor's properties.
4. Verify the Extractor.
5. Click Save.
For help with specific extractor types, check the light bulb on the right.
See Also: How to create an Extractor
Help box: Edit Extractor
To edit an Extractor:
1. Change the Extractor's properties.
2. Verify the Extractor.
3. Click Save.
For help with specific extractor types, check the light bulb on the right.
See Also: How to create an Extractor

Help box: Extractor Type
The Extractor Type defines the text search rule; the sketch below illustrates the first two types:
1. Text Delimited extracts a value that occurs in the response between two specified text strings.
2. Regular Expression extracts a value found in the response using a regular expression search.
3. Header extracts the value of a specific response header selected in the Header drop-down.
4. Form Field extracts the value of a specific web form field selected in the Form Field drop-down.
5. XPath query extracts a value found in an XML response body.
See Also: Text Delimited Extractor, Regular Expression Extractor, Header, Form Field and XPath Extractor

Help box: Next / Previous Extractor
If more than one extractor value can be found in the response, continue clicking Next/Previous Occurrence until you find the correct value. Then click Set the Occurrence to adjust this property accordingly.
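A minimal Python sketch of the Text Delimited and Regular Expression extraction rules, run against a hypothetical ASP.NET-style response fragment (Python spells the named group (?P<val>...) rather than .NET's (?<val>...)):

```python
import re

response = '<input name="__VIEWSTATE" value="dDwtMTM4..." />'

# Text Delimited: take what sits between "Text Before" and "Text After".
text_before, text_after = 'value="', '"'
start = response.index(text_before) + len(text_before)
value = response[start:response.index(text_after, start)]

# Regular Expression: a single named group <val> captures the value.
match = re.search(r'value="(?P<val>[^"]+)"', response)
assert match and match.group("val") == value  # both rules agree
```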
1.2.3.2 Datasets

Toolbar:
- New: Create a new Dataset
- Create Authentication Dataset
- Import a CSV file as a Dataset
- Edit the selected Dataset
- Export the Dataset to a .csv file
- Delete the selected Dataset

Help box: Datasets
A Dataset is a predefined set of records. Datasets are used in request parameters to retrieve a value from the Dataset, assign the value to a variable, and replace a recorded value in the request with the variable. The variable name is <dataset name>.<column name>.
See Also: Datasets

Help box: Dataset Commands
1. To add a Dataset, click New to create an empty Dataset that can be populated manually or by pasting values, or click "Import a CSV file as a Dataset".
2. To create a dataset for Forms authentication, click Authentication Dataset.
3. To edit data, select a Dataset from the drop-down.
4. To edit the Dataset structure, click Edit.
5. Other available operations are Export Data and Delete.
Note: URL encoding in .csv files is supported. For Basic or Windows Integrated (e.g. NTLM or Kerberos) authentication, use the Authentication section.
See Also: Datasets

Help box: Add/Edit Dataset Structure
To add a field, enter the field name and click "Add Field". Double-click a field to rename it. To reposition, rename or delete a selected field, use the Up, Down, Rename or Remove buttons. To create or rename a Dataset, enter a one-word Dataset name.
See Also: Datasets

1.2.3.3 Data Generators

Toolbar:
- New: Create a new Data Generator
- Verify the value returned by the selected Data Generator
- Delete the selected Data Generator

Help box: Data Generators
A Data Generator returns a random value assigned to a variable with the same name, which can be used to parameterize requests.
See Also: Data Generators

Data Generator properties:
- Name: The Data Generator name
- Min Value: The minimum generated value
- Max Value: The maximum generated value
- Format String: A string specifying the format in which the value is presented. For details about the formats of different data types, see "Formatting Types" in .NET.
- Type: Select "Random" to generate random integers between the Min Value and Max Value. Select "AutoIncrement" to generate sequential integers starting from the Min Value; after the Max Value is reached, the next integer is the Min Value again. Used for Integer Data Generators only.
- When To Evaluate: Select "On Iteration" to generate a new value on every iteration (default). Select "On Request" to generate a new value on every request.
- Description: The Data Generator description

Create Data Generator toolbar:
- Save the Data Generator and close this window
- Save the Data Generator and create a new one
- Verify the value returned by the Data Generator
Help box: Data Generator Types
The Data Generator Type determines what random value is returned: Integer, Double, Date/Time, GUID or Text (see the sketch below).
See Also: Data Generators

Help box: Create Data Generator
To create a Data Generator:
1. Select its Type from the list on the left.
2. Configure its properties.
3. Click Verify to test the returned value.
4. Click Save.
See Also: Data Generators

Help box: Data Generator Properties
The Name, MinValue, MaxValue, FormatString, When To Evaluate and Description properties are described above.
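A hypothetical Python sketch of the generator behaviors described above. Note that the manual's Format Strings are .NET "Formatting Types"; the Python format below is only an analogue:

```python
import random
import uuid

def random_int(min_value, max_value, format_string="{0}"):
    # Random Integer type: a value between Min Value and Max Value
    return format_string.format(random.randint(min_value, max_value))

def auto_increment(min_value, max_value):
    # AutoIncrement: sequential integers that wrap back to Min Value
    # after Max Value is reached (Integer generators only)
    value = min_value
    while True:
        yield value
        value = min_value if value >= max_value else value + 1

gen = auto_increment(1, 3)
print([next(gen) for _ in range(5)])        # [1, 2, 3, 1, 2]
print(random_int(100, 999, "ID-{0:05d}"))   # e.g. ID-00457
print(uuid.uuid4())                         # a GUID-type value
```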
1.2.3.4 Functions

Toolbar:
- New: Create a new Function Instance
- Delete the selected Function Instance

Help box: Function Instances
A Function Instance returns a dynamic value that depends on the function type and the context from which the function is called. The value is assigned to a variable with the same name, which can be used to parameterize a request. The Function Instance is evaluated before the request is issued.
See Also: Functions

Function Instance properties:
- Name: The Function Instance name
- Function Type: Determines what internal variable or constant is returned by the Function Instance. The options are: Agent Name, Agent VU Number, Test Case Name, Agent Iteration Number, URL Number, Agent Request Number, Current DateTime, Current UTC DateTime.
- Use Unix Time format?: Available for the Current DateTime and Current UTC DateTime functions. Select Yes to return the number of milliseconds elapsed since January 1, 1970 (see the sketch below). Select No (default) to use the other formatting options.
- Format String: A string specifying the format in which the value is presented. For details about the formats of different data types, see "Formatting Types" in .NET.
- Description: The Function Instance description

Create Function Instance toolbar:
- Save the Function Instance and close this window
- Save the Function Instance and create a new one

Help box: Function Types
Supported Function Types:
- AgentName: the name of the current Agent
- AgentVUNumber: the current VU number within an Agent's pool of VUs
- TestCaseName: the name of the current Test Case
- AgentIterationNumber: the current iteration number executed by an Agent
- URLNumber: the current URL number within a test case
- AgentRequestNumber: the current request number issued by an Agent since the beginning of the test
- Current DateTime: the current date/time
- Current UTC DateTime: the current UTC date/time
See Also: Functions
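A small Python sketch of the two date/time outputs; the strftime pattern is only an approximate analogue of a .NET format string such as "yyyy-MM-dd HH:mm:ss":

```python
import time
from datetime import datetime, timezone

# "Use Unix Time format?" = Yes: milliseconds elapsed since Jan 1, 1970 (UTC)
unix_ms = int(time.time() * 1000)

# Otherwise a Format String is applied to the current (UTC) date/time
formatted = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
print(unix_ms, formatted)
```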
Help box: Create Function Instance
To create a Function Instance:
1. Select a Function Type (described in the light bulb on the left).
2. Configure the Function Instance properties.
3. Click Save.
See Also: Functions

Help box: Function Properties
The Name, Format String and Description properties are described above.

1.2.4 Parameters

Toolbar (all parameterization controls):
- Show the selected Parameter in the Test Case Tree
- Save
- Undo to Last Saved (Ctrl+Z)
- Restore to recorded
- Switch to Parameterization Editor
- Switch to Parameterization Grid
- Switch to Free Format Request Editor
- Hide/Show autocorrelation parameter details
- Find and Replace parameters' values (Ctrl+F)
- Find Value text box
- Find Next (F3) / Find Previous (Shift+F3)
- Session Number label

Header Parameterization Grid columns:
- Header: The request header name
- Recorded Value: The request header's recorded value
- Replace with: The parameterization expression replacing the recorded value

Query String Parameterization Grid columns:
- Query Parameter: The query string parameter name
- Recorded Value: The query string parameter's recorded value
- Replace with: The parameterization expression replacing the recorded value

Web Form Parameterization Grid columns:
- Form Field: The form field name
- Recorded Value: The form field's recorded value
- Replace with: The parameterization expression replacing the recorded value
Global Find and Replace ... (Ctrl+F)

Help box: Parameterization Controls
Which parameterization control to use:
- Parameterization Grid: for configuring name/value pair parameters.
- Parameterization Editor: for configuring name/value pair parameters with long values, or when Find and Replace is needed.
- Free Format Request Editor: for configuring free-format requests.
See Also: Parameterization Controls

Help box: Parameterization Grid
Parameters change recorded requests during the replay. The new values are derived from Variables. To create a Parameter:
1. Select a request in the Test Case Tree.
2. Select the tab above for the relevant request part: Header, URL and Query, or Body.
3. Click the "Replace with" column of the Parameter.
4. In the Variable Picker that appears, select the Variable.
See Also: Parameterization Grid

Help box: Parameterization Editor
The Parameterization Editor displays name/value pairs as a read-only blue "name line" and an editable "value line". To edit data:
1. Optionally select the text in a "value line" that should be replaced.
2. Right-click in the "value line" and select a source Variable in the Variable Picker that appears.
- To find text, enter it into the "...Find" box and click Find Next. For an advanced search, click Global Find and Replace.
- Click Save to save the changes.
- Click Undo to Last Saved to discard the changes.
See Also: Parameterization Editor

Help box: Free Format Request Editor
Select or search for the text to be parameterized, right-click, and select a Variable in the Variable Picker that appears.
- To find text, enter it into the "...Find" box and click Find Next. For an advanced search, click Global Find and Replace.
- Click Save to save the changes.
- Click Undo to discard the changes.
- Click Restore to restore the recorded request.
See Also: Free Format Request Editor

1.2.4.1 Variable Picker
Help box: Variable Picker
Select a Variable in the Extractor, Data Generator or Function category, or select a Variable as a Dataset, field and databinding method. The Variable is injected into the request and replaces any selected text.

Help box: DataBinding Methods
The databinding method determines the order in which dataset rows are assigned to request parameters (see the sketch after this list):
- Request-Bound: every parameter requested by any VU in any iteration gets the next dataset row.
- VU-Bound: every VU gets the next dataset row, used for all its parameters requested in all iterations.
- Iteration-Bound: every iteration gets the next dataset row, used by all VUs in all requested parameters.
- Iteration-Request-Bound: every subsequently requested parameter in every iteration gets the next dataset row, shared by all VUs.
- VU-Iteration-Bound: every VU in every iteration gets the next dataset row, used in all its requested parameters.
- Parameter-Bound: every requested parameter gets the next dataset row, shared by all VUs in all iterations.
- Random: every request parameter gets a random dataset row.
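A hypothetical Python illustration of three of these orders, for 2 VUs running 2 iterations against a 4-row dataset (row selection logic is only indicative of the ordering, not of StresStimulus internals):

```python
import random

dataset = ["row0", "row1", "row2", "row3"]

def vu_bound(vu, iteration):
    # VU-Bound: one row per VU, reused in every iteration
    return dataset[vu % len(dataset)]

def iteration_bound(vu, iteration):
    # Iteration-Bound: one row per iteration, shared by all VUs
    return dataset[iteration % len(dataset)]

def random_bound(vu, iteration):
    # Random: any row, independent of VU and iteration
    return random.choice(dataset)

for vu in range(2):
    for it in range(2):
        print(vu, it, vu_bound(vu, it), iteration_bound(vu, it), random_bound(vu, it))
```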
1.2.5 Response Validators

Toolbar:
- Create a Validator for a response selected in the Test Case Tree or the session grid
- Show the selected Validator in the Test Case Tree
- Show the response associated with the Validator in the session grid
- Move the selected Validator to the selected response
- Clone the selected Validator to the selected response(s)
- Delete the selected Validator(s)

Help box: Validators
A Validator is a rule for comparing a response with a text pattern. In case of a mismatch, a custom error is raised.
See Also: Validators
Validator properties:
- URL: The URL of the response to be validated
- Text to search: The text/HTML or regular expression to search for in the response
- Is text a regular expression?: Select Yes if the validation string is a regular expression
- Fail If: Choose whether to raise the error when the string is Found or Not Found in the response
- Scope: Set to "Selected Response" to create a Validator for the request selected in the Test Case Tree or the session grid. Set to "All Responses" to create a global Validator.
- Description: The Validator description

1.2.6 Verify & Auto-config

Verify & Auto-config toolbar:
- Click to auto-verify the Test Case
- Drop-down: select the Full or Quick verify method
- Full: verify with a preview of web pages
- Quick: verify without a preview of web pages
- Enter a session number; Verify will stop after this session
- Run Parameter Finder

Verify & Auto-config help boxes:

Help box: Verify Test Case
When verifying the Test Case:
1. The test runs once in debug mode. Test Configuration settings do not affect this run.
2. Replayed sessions are automatically compared with the corresponding recorded sessions and deeply analyzed.
3. Errors, warnings, configuration recommendations and other diagnostics are displayed in the Session Verification Tree and the Extractor Verification Tree.
- To change the number of VUs, enter a different VU number.
- To select the Full or Quick verify method, click the drop-down.
- To stop Verify early, specify the session number after which to stop.
See Also: Verifying Test Case

Help box: Parameter Finder
Parameter Finder finds possibly missing extractors and parameters. Creating them can fix configuration errors and make the test more realistic. Run Parameter Finder after running Verify.
See Also: Parameter Finder

1.2.6.1 Session Verification

Session Verification toolbar:
- Expand All / Collapse All
- Compare the selected recorded and replayed sessions
- Show all verified sessions
- Show sessions with errors
- Show sessions with warnings
- Show sessions with notifications

Session Verification help boxes:

Help box: Session Verification Tree
The Session Verification Tree matches the recorded and replayed requests in the Test Case.
- To compare recorded and replayed sessions selected in the tree, click the Compare button.
- To view session content, double-click a Recorded or Replayed node.
See Also: Comparing Sessions

Help box: Session Filtering
To display a subset of sessions, click one of the filtering buttons on the toolbar:
- URLs: all sessions
- Errors: sessions with errors related to the test configuration
- Warnings: sessions with issues that may be related to the test configuration
- Notifications: sessions with issues unrelated to the test configuration

1.2.6.2 Parameter Finder

Parameter Finder toolbar:
- Group by Extractors
- Group by Requests
- Expand All
- Collapse All
- Parameter Creator: auto-configure the Extractor and all Parameters in the selected node
- Parameter Creator: auto-configure all Parameters in the selected node
- Auto-configure all Extractors and Parameters
- Delete the selected parameter recommendation

Parameter Finder help boxes:

Help box: Parameter Finder Tree
The Parameter Finder Tree displays possibly missing extractors and parameters. It has two views:
1. The Group by Request view displays: (a) on each parent node, a request requiring one or several parameters; (b) on each child node, a parameter with its matching extractor.
2. The Group by Extractor view displays: (a) on each parent node, an extractor that can be used in one or several parameters; (b) on each child node, a parameter using the extractor.
Note: To copy the selected object's content, hit Ctrl+C.
See Also: Parameter Finder

Help box: Parameter Finder Tab
To create the Extractors and Parameters discovered by the Parameter Finder one by one, use the "Parameterization Tool" (Parameter Creator). The Auto-Configurator creates all Extractors and Parameters at once.
See Also: Parameter Creator, Auto-Configurator

1.2.6.3 Extractor Verification

Extractor Verification toolbar:
- Show the selected Extractor in the Test Case Tree
- Show the response associated with the Extractor in the session grid
- Delete the selected extractors and associated parameters

Extractor statuses:
- The extractor is OK.
- The extractor is not found in the response.
- The extractor's value is not used in the Test Case.
- The recorded and replayed extractors are the same.

Extractor Verification help boxes:

Help box: Extractor Verification Tree
The Extractor Verification Tree is generated when running Verify. It displays extractor values and the following exceptions:
- The extractor is not found in the response.
- The extractor's value is not used in the Test Case.
- The recorded and replayed values are the same.
Extractors with exceptions are automatically checked for easy removal, as some of them may be unnecessary.
See Also: Verifying Test Case

Extractor Verification properties:
- Extractor name: The name of the extractor
- Value: The value returned by the extractor during the verify

1.2.7 Multi Test Cases

Toolbar:
- Open a session file as a Test Case
- Import Test Cases from another Test
- Click to view the selected Test Case and unlock it for changes
- Clone the selected Test Case
- Delete the selected Test Case
- Export the Test Case as an HTTP archive (.har)

Help box: Multi-Test Cases
Multi Test Cases are used to emulate different categories of users.
- Test Cases are executed concurrently.
- VUs are distributed between the Test Cases in proportion to their Mix Weight properties.
- To start configuring or reviewing the selected Test Case, double-click it or click the "Click to view …" button on the toolbar. After that, the entire Build Test Case section of the Workflow Tree is associated with this Test Case.
Note: Selecting a Test Case as current does not impact the concurrent execution of multi-test cases.
See Also: Multiple Test Cases, Editing and Deleting Test Case
Help box: Multi-Test Case commands
- To create a new Test Case, click "Record", or click "Open session file" and select an .saz or .har file.
- To import Test Cases from another Test, click Import.
- To clone the selected Test Case, click "Clone Test Case".
- To delete the selected Test Case, click Delete.
- To change the properties of the selected Test Case, modify them in the property grid below.
See Also: Editing and Deleting Test Case

Test Case properties:
- Name: The Test Case name
- Description: The Test Case description
- Mix Weight: The relative frequency (in units) of the Test Case's replays in the mix. Every VU is assigned to a specific Test Case selected in round-robin order, skipping some of them to achieve the VU distribution corresponding to the mix weights (see the sketch below).
- URLs: The number of requests in the Test Case
- Think Times between pages: Select "Recorded" to inject the recorded think time after every page, "Constant" to use a constant think time, or "Random" to randomize think times. Tip: for stress tests, select "Zero".
- Delay after the test case: Tips: for stress tests, select "Zero"; to issue iterations with a certain frequency, select "Pacing".
- Cache Control: Select "Enabled" to emulate browser caching and session management. Select "Disabled" to emulate browsers with caching disabled (all requests are sent) and browsers restarting before each new iteration (browser sessions do not persist across iterations).
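A minimal sketch of weight-proportional VU assignment in Python. The exact round-robin-with-skipping algorithm is not documented here, so this only shows the intended outcome: the share of VUs per test case matches its mix weight:

```python
test_cases = {"Browse": 3, "Buy": 1}   # hypothetical names -> Mix Weight
total = sum(test_cases.values())

def assign(vu_number):
    # Map a VU to a test case so that VU shares follow the 3:1 weights
    slot = vu_number % total
    for name, weight in test_cases.items():
        if slot < weight:
            return name
        slot -= weight

print([assign(n) for n in range(8)])
# ['Browse', 'Browse', 'Browse', 'Buy', 'Browse', 'Browse', 'Browse', 'Buy']
```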
1.2.7.1 Test Case Groups

Toolbar:
- Create a Test Case Group
- Edit the selected Test Case Group
- Delete the Test Case(s) from the Test Case Group

Help box: Test Case Group
When at least one Test Case Group is created, the Sequential-Concurrent TC mixing model is used:
- Test cases in a TC Group are executed sequentially.
- Multiple TC Groups are executed concurrently.
- VUs are distributed between the TC Groups in proportion to their Mix Weight properties.
- Only test cases included in TC Group(s) are executed.
To go back to the Concurrent TC mixing model, delete all TC Groups.
See Also: Sequential Test Case Groups

Test Case Groups dialog commands:
- Add Test Case(s) to the Test Case Group
- Delete Test Case(s) from the Test Case Group
- Move Test Case up for earlier execution
- Move Test Case down for later execution

Test Case Group properties:
- Name: The Test Case Group name
- Description: The Test Case Group description
- Mix Weight: The relative frequency (in units) of the Test Case Group's replays in the mix. Every VU is assigned to a specific Test Case Group selected in round-robin order, skipping some of them to achieve the VU distribution corresponding to the mix weights.
- Cache Control: Select "Enabled" to emulate browser caching and session management. Select "Disabled" to emulate browsers with caching disabled (all requests are sent) and browsers restarting before each new iteration (browser sessions do not persist across iterations).
- New VU %: The percentage of New vs. Returning VUs. Note: (a) on the first iteration, New VUs have an empty cache, like first-time users, so all requests are sent; (b) on the first iteration, Returning VUs have a primed cache, and caching rules for each request are determined from the server caching headers; (c) on subsequent iterations, all VUs are treated as Returning VUs.
- VU restarting browsers %: The percentage of VUs restarting their browsers before starting a new iteration. For these users, browser sessions do not persist across iterations.

1.3 Configure Test

The Configure Test section includes the following nodes:
- Load Pattern: Configure the load
- Test Duration: Configure the test completion criteria
- Browser Type: Configure the browser mix
- Network Type: Configure the network mix
- Load Agents: Configure load agents
- Server Monitoring: Configure the servers' performance monitoring
- Result Storage: Configure result storage settings and the amount of saved data
- Other Options: Advanced options
- Script Editor: Modify the test script
Help box: Configure Test
Navigate through every "Configure Test" item and select the desired test parameters.

Property grid:
- Test Run Name: A one-word test run name used as a suffix following the time-stamp of the next test run displayed in the "Analyze Results" section. If the Test Run Name is empty, the test file name is used as the suffix.
- Test Description: A test description included in the test result Summary view

1.3.1 Load Pattern

Help box: Load Pattern
The load pattern defines the dynamics of virtual users (VUs) throughout the test.
See Also: Load Pattern

Load Pattern properties:
- Load Pattern: Select "Steady Load" to keep a steady number of VUs. Select "Step Load" to ramp up the number of VUs on every step.
- Number of VU: The constant number of VUs emulated throughout the test
- Start VU: The initial number of VUs at the beginning of the test
- Step VU Increase: The number of VUs added on every step
- Step Duration (s): The time interval between VU count increases
- Max VU: The maximum VU count. Note: the test can complete before reaching "Max VU" if the Test Duration is not long enough.
- Over (s): The amount of time taken at the beginning of each step to gradually add the VUs. For an instant increase, use zero (see the sketch below).
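A minimal sketch of how the Step Load properties combine to give the VU count at a point in time. The function is hypothetical; it just applies the arithmetic implied by the property descriptions above:

```python
def vu_count(t_s, start_vu, step_increase, step_duration_s, max_vu, over_s=0):
    """VUs active at time t: start with Start VU, add Step VU Increase
    every Step Duration, ramping each step over 'Over (s)' seconds,
    never exceeding Max VU."""
    step = int(t_s // step_duration_s)
    base = start_vu + step * step_increase
    into_step = t_s - step * step_duration_s
    if over_s > 0 and into_step < over_s:
        # partial ramp within the current step
        base = base - step_increase + int(step_increase * into_step / over_s)
    return min(base, max_vu)

# 10 starting VUs, +10 every 60 s, capped at 40:
print([vu_count(t, 10, 10, 60, 40) for t in (0, 59, 60, 120, 300)])
# [10, 10, 20, 30, 40]
```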
1.3.2 Test Duration

Help box: Test Duration
Set the test completion criteria. After reaching this condition, the test stops.
See Also: Test Duration

Test Duration properties:
- Test Completion Condition: Select the test completion condition from the drop-down. The options are: Number of Iterations, Run Duration, Reaching Max VUs.
- How to count iterations: Select "Per VU" to set the iteration limit for each VU. Select "Total" to set the overall iteration limit.
- Max. iterations: The iteration limit (counted per VU or in total, per the previous setting).
- Load generation time (hh:mm:ss): The duration of load generation, after which no further requests are issued.
- After the Completion Condition is reached: Depending on this selection, the test will stop immediately, wait until all pending responses are received, or wait for iterations to complete. The options are: Wait for responses, Stop the test, Wait for iterations to complete.
- Warm-up time (s): The warm-up period at the beginning of the test prepares the server for normal working conditions. During warm-up, the number of VUs is gradually ramped up, the server cache is populated, and the necessary modules are loaded into memory. Performance metrics are not collected during warm-up.
1.3.3 Browser Type

Toolbar:
- Add: add a browser to the mix, then configure its settings
- Delete: delete the selected browser from the mix

Help box: Browser Type
Configure web browser settings and, if necessary, add more web browsers to the mix.
Tip: How StresStimulus emulates web browsers: (a) it maintains the configured connection limits; (b) it injects the appropriate user-agent string into the requests; (c) it maintains the browser mix distribution if more than one browser is selected.
See Also: Browser Settings

Browser properties:
- Browser Type: Select a web browser or "Custom" from the drop-down. Supported browser types: IE11, IE10, IE9, IE8, IE7, IE6; Firefox; Chrome; Opera; Safari; Non-browser application; Custom.
- Mix Weight: The relative frequency (in units or percents) with which VUs use this browser
- Replace User-Agent string: Select "True" to use the User-Agent string of the selected browser type instead of the recorded string. Select "False" to keep the recorded string.
- User Agent: If replacing the recorded User-Agent string, enter the custom string
- Connection limit per host: To set custom browser performance, enter the maximum number of TCP connections per host
- Connection limit per proxy: To set custom browser performance, enter the maximum number of TCP connections across all hosts

1.3.4 Network Type

Toolbar:
- Add: add a network to the mix, then configure its settings
- Delete: delete the selected network from the mix

Help box: Network Type
Configure network settings and, if necessary, add more networks to the mix.
Tip: Network types other than LAN are emulated by injecting a wait time into every request and response, weighted by its size and the network type's bandwidth.
See Also: Network Settings
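The injected wait time in the tip above is roughly the time the payload would occupy a link of the configured bandwidth. A back-of-the-envelope Python sketch (the actual weighting is internal to StresStimulus):

```python
def transfer_delay_s(size_bytes, bandwidth_kbps):
    # Time for size_bytes to traverse a link of bandwidth_kbps (kilobits/s)
    return (size_bytes * 8) / (bandwidth_kbps * 1000)

# A 150 KB response on a 1,536 kbps (DSL-class) downlink:
print(round(transfer_delay_s(150 * 1024, 1536), 2), "seconds")  # ~0.8
```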
Network properties:
- Network Type: Select a network type or "Custom" from the drop-down
- Mix Weight: The relative frequency with which VUs use this network
- Upload Bandwidth (kbps): The upload bandwidth (kbps)
- Download Bandwidth (kbps): The download bandwidth (kbps)

1.3.5 Load Agents

Toolbar:
- Add a Load Agent connection
- Edit the selected Load Agent connection
- Test connections to the Load Agents with non-zero VU weights
- Reset the selected Agent
- Delete the selected Load Agent connection

Help box: Load Agents
Load Agents are computers that emulate virtual users in a distributed test orchestrated by this controller. To create a Load Agent:
1. On a remote computer, install StresStimulus and, in the StresStimulus menu -> Agent Options, enable Agent Mode.
2. On this computer, add a connection to the Load Agent.
To set the portion of the total number of VUs run on this Load Agent, change its Mix Weight property. To set the total number of VUs in the test, navigate to the Load Pattern section.
See Also: Attaching Agents to Controller

Load Agent properties:
- Agent Name: The Load Agent name
- StresStimulus Version: The version of StresStimulus installed on the agent
- Host / IP Address: The Load Agent host. Enter a network computer name or IP address without "//". Example: AGENT1 or 10.2.2.169
- Mix Weight: The relative number of VUs (in units or percents) emulated on the Load Agent. To disable a Load Agent, set its mix weight to zero.
- Starting Thread Count: The number of threads created automatically when the test is launched. If more threads are needed, the load engine gradually creates more while checking available system resources. Increasing the starting thread count can increase load engine performance, but it can also overload systems with limited resources.
- VUs: The constant number of VUs (read-only)
- Start VUs: The starting number of VUs (read-only)
- Step VU Increase: The VU step increase if the Step Load pattern is used (read-only)
- Max VUs: The maximum number of VUs if the Step Load pattern is used (read-only)
- Username: The username to access the remote Agent
- Password: The password to access the remote Agent

1.3.6 Monitoring

1.3.6.1 Windows Servers and Agents

Toolbar:
- Add a machine to monitor
- Edit the performance counters
- Delete the selected objects
- Find Previous / Find Next

Help box: Server or Agent Monitoring
During the test run you can monitor multiple performance counters on the web, application and database servers, as well as on agents. Real-time graphs and performance values are displayed on the runtime dashboards and in the performance reports.
- Click "Add" to add a server with the default set of counters.
- Click "Edit" to add or delete performance counters for the selected server.
- Click "Delete" to delete the selected object(s).
See Also: Windows Servers Monitoring, Linux/UNIX Servers Monitoring, Threshold Rules

Help box: Add a Machine for Monitoring
Enter a machine IP address or computer name without "//". Example: 10.2.2.169 or WEB_SRV5
Property grid:
- Machine: A network computer IP address or computer name without "//"
- Domain: The network domain name
- UserName: The user name on the network computer
- Password: The password
- Category: The performance counter category
- Counter: The performance counter name
- Instance: The performance counter instance
- Enable Threshold?: Select Yes to enable the threshold. The default is No.
- Warning Threshold: The warning threshold value
- Critical Threshold: The critical threshold value
- Alert if Over?: Select Yes to indicate that exceeding a threshold is a problem. Select No to indicate that falling below a threshold is a problem (see the sketch at the end of this section).

Add Performance Counters help boxes:

Help box: Add Windows Server Performance Counters
1. Select a performance object, counter, and instance.
2. Click "Add" to add it to the New Counter List.
Note: The performance counters are the same as in the Windows Perfmon application.
See Also: Windows Server Monitoring

Help box: Add Agent Performance Counters
1. Select the local or a remote Agent.
2. For remote Agents, enter an IP address or computer name without "//". Example: 10.2.2.169 or WEB_SRV5
3. Select a performance object, counter, and instance.
4. Click "Add" to add it to the New Counter List.
See Also: Windows Server Monitoring

Help box: New Counter List
Highlight a counter below to see its description. Click "Delete" to remove the highlighted counters from the New Counter List. Click "Save" to add the New Counter List to the test.
See Also: Server Monitoring

1.3.6.2 Linux/UNIX Servers

Help box: Add Linux/Unix SNMP Performance Counters
1. Enter a host IP address or name.
2. Change the Community, if necessary.
3. Select a counter from the drop-down list.
4. To add counters that are not on the list, enter the OID.
5. Enter or edit the counter name.
6. Click Test to test the counter.
7. Click "Add" to add it to the New Counter List.
See Also: Linux/UNIX Servers Monitoring

Listed SNMP performance counters:
- CPU counters: percentage of user CPU time, percentage of system CPU time, percentage of idle CPU time
- Memory counters: total swap size, available swap space, total RAM, total RAM free, total RAM buffered, total cached memory
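A hypothetical Python sketch of how the threshold properties from the monitoring property grid above could be evaluated. "Alert if Over?" = Yes flags values above the thresholds; No flags values below them (e.g. free memory):

```python
def counter_status(value, warning, critical, alert_if_over=True):
    # Negate everything when low values are the problem, so one
    # comparison direction covers both cases
    if not alert_if_over:
        value, warning, critical = -value, -warning, -critical
    if value >= critical:
        return "critical"
    if value >= warning:
        return "warning"
    return "ok"

print(counter_status(92, warning=85, critical=95))        # warning (CPU %)
print(counter_status(200, warning=500, critical=100,
                     alert_if_over=False))                 # warning (free MB)
```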
1.3.7 Result Storage

Properties:
- How much data to store: Select "All" to store the fullest dataset. Select "Partial" to store all data except the content of individual HTTP sessions. Select "None" to store no data in the database; only the last test result will be available, and only until StresStimulus is closed.
- Data Storage: Select the test result data storage from the drop-down. Note: SQL Server CE capacity is limited to 4 GB.
- SQL Server connection string: Click "…" to enter the SQL Server connection information in the pop-up window.
- Purge request bodies: Purging the bodies of test session requests saves memory. Select "All" to purge all bodies, "None" to keep all bodies, or "Non-Errors" to keep only the bodies of sessions with errors.
- Purge response bodies: Purging the bodies of test session responses saves memory. Select "All" to purge all bodies, "None" to keep all bodies, "Non-Errors" to keep only the bodies of sessions with errors, or "Static Mime Types" to purge the bodies of images and other static resources.
- Save sessions from agents?: In distributed tests with SQL Server CE-based storage, the content of sessions generated on the agents is stored on the agents. Select Yes to copy this content to the controller; this allows generating waterfall diagrams for VUs emulated on the agents. Select No to reduce traffic between the agents and the controller when network bandwidth is limited.

Help box: Result Storage
Test results are stored in a database. Configure the data storage type and the amount of data to store.
See Also: Test Result Storage

Help box: Connection Settings
Click Create/Check DB to create a new database or verify the connection to an existing database.
Click OK to set the database as the data storage for the test. Click Cancel to go back without changes.
See Also: Test Result Storage

1.3.7.1 Test Pass/Fail Qualification

Help box: Test Pass/Fail Configuration
You can configure several test quality criteria. When at least one of these criteria is missed, the test is qualified as Failed. To create a test quality criterion, enable a Pass/Fail condition in the property grid and specify its acceptable limit (see the sketch below).
See Also: Configuring Test Pass/Fail Qualification

Property grid:
- Page Goal Misses: Enter Yes if Page Goal Misses are subject to the test's Pass/Fail condition
- Page Goal Threshold: The percentage of Page Goal Misses that triggers the Fail condition. Enter 0 to fail the test on a single Page Goal violation.
- Transaction Goal Misses: Enter Yes if Transaction Goal Misses are subject to the test's Pass/Fail condition
- Transaction Goal Threshold: The percentage of Transaction Goal Misses that triggers the Fail condition. Enter 0 to fail the test on a single Transaction Goal violation.
- Request Errors: Enter Yes if Request Errors are subject to the test's Pass/Fail condition
- Request Error Threshold: The percentage of Request Errors that triggers the Fail condition. Enter 0 to fail the test on a single Request Error.
- Request Timeouts: Enter Yes if Request Timeouts are subject to the test's Pass/Fail condition
- Request Timeout Threshold: The percentage of Request Timeouts that triggers the Fail condition. Enter 0 to fail the test on a single Request Timeout.
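A minimal sketch of the qualification logic in Python. The metric and rule names are hypothetical; the point is that any enabled criterion whose observed percentage exceeds its threshold fails the test, and a threshold of 0 fails it on a single violation:

```python
def test_passed(metrics, rules):
    # rules: name -> (enabled, threshold_pct); metrics: name -> observed pct
    for name, (enabled, threshold_pct) in rules.items():
        if enabled and metrics.get(name, 0.0) > threshold_pct:
            return False
    return True

rules = {
    "page_goal_misses_pct": (True, 5.0),
    "request_errors_pct":   (True, 0.0),   # any error fails the test
    "request_timeouts_pct": (False, 0.0),  # not part of pass/fail
}
print(test_passed({"page_goal_misses_pct": 3.1,
                   "request_errors_pct": 0.0}, rules))  # True

1.3.8 Other Options

Properties:
- Graph sample rate (s): How often the performance counters are read and the graphs are refreshed. The recommended value is 10 s with agents and 5 s without agents. Increase the sample rate for long tests.
- Pre-run Command Line: A command line to execute before the test starts
- Pre-run Command Timeout: The time limit for the pre-run command to complete
- MIME Types requested sequentially: Click the drop-down and enter MIME types whose requests must be issued only after all previous responses are received (sequentially). Some MIME types (e.g. text/html) are always requested sequentially. You can enter additional MIME types to prevent dependent requests of these types from being issued in parallel with other dependent requests on a page. Separate multiple entries with ",". For example, enter "image,video" to request all images and videos sequentially, or "video/mp4" to request MP4 video sequentially.
- Enable Dynatrace integration?: Select Yes to add the x-dynaTrace header to each issued request.

Help box: Other Options
For information about the properties in this section, check the following sources: Pre-run command line, Dynatrace integration.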
1.3.9 Script Editor

Toolbar:
- Save Script (Ctrl+S)
- Save and Exit Script Editor
- Validate against the SSScript XSD
- Show the XSD, the SSScript schema document
- Cut (Ctrl+X) / Copy (Ctrl+C) / Paste (Ctrl+V)
- Undo (Ctrl+Z) / Redo (Ctrl+Y)
- Find (Ctrl+F) / Find Next (F3) / Find Previous (Shift+F3)
- Bookmark (Ctrl+F2) / Bookmark Next (F2) / Bookmark Previous (Shift+F2)

Help box: Script Editor
The script is an XML representation of the Test Object Model (TOM). Test modifications can be made either by editing the script or by changing the corresponding settings in the UI.

Help box: SSScript XSD
The SSScript XSD is the XML schema of StresStimulus scripts. It is used to validate test scripts and to find errors. The SSScript XSD is provided here as a reference for script development.
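Outside the built-in Validate button, a script could also be checked against the XSD with a generic tool. A sketch using the third-party lxml package; the file names are hypothetical:

```python
from lxml import etree

schema = etree.XMLSchema(etree.parse("SSScript.xsd"))  # hypothetical file names
script = etree.parse("MyTest.script.xml")

if schema.validate(script):
    print("script is valid")
else:
    for error in schema.error_log:
        print(error.line, error.message)
```

1.4 Run and Monitor Test

Click Run Test in the Workflow Tree. The following message box appears:
- Click Run and Monitor Test to start the test normally.
- Click Debug to run the test in debug mode. All replayed sessions are displayed in the session grid, and response bodies are not purged.
- Click Cancel to go back.

1.4.1 Runtime Dashboard

Toolbar:
- Stop: abort the test run
- Pause: suspend the test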
- Resume: resume the test
- Add VUs to the test
- Skip pending requests (trigger a timeout) and continue the test
- Retrieve sessions from the test log (delayed)
- Health Monitor - Normal
- Health Monitor - High Load: CPU utilization is approaching the acceptable limit
- Health Monitor - Overloaded: stop unessential processes or reduce the number of VUs
- Select Graph Layout

Graph layouts:
- One Graph
- Two Horizontal Panels / Two Vertical Panels
- Three Horizontal Panels / Three Vertical Panels
- Three Panels Oriented Left / Right / Top / Bottom
- Four Horizontal Panels / Four Vertical Panels
- Four Panels
Help box: Graphs
The graphs display instantaneous performance characteristics and performance counter data, plotted at the frequency defined by the Sample Rate period.
- To select the graph panel layout, click "Select Graph Layout".
- To select which graph to display in a graph panel, click the drop-down above it.
For more graph commands, right-click a graph.
See Also: Runtime Dashboard, Agents / Test Cases Progress Grid

Help box: Test Run Commands
- To stop the test, click "Stop".
- To pause or resume the test run, click "Pause"/"Resume".
- To increase the VU count on demand, set the VU adjustment value and click "+".
- To abandon pending requests, click "Skip".
Note: VU adjustment works with the "Steady Load" pattern only.
See Also: Runtime Dashboard, Controlling the Test

Help box: Test Engine Health Status
For accurate load testing, CPU utilization should not exceed 85%.
- Green: Normal. CPU utilization is under 85%.
- Yellow: High Load. CPU utilization is 85-95% and is approaching the acceptable limit.
- Red: Overloaded. CPU utilization exceeds 95%. Metrics accuracy can be impaired; stop unessential processes or reduce the number of VUs.
See Also: Monitoring Test Progress and Health
1.4.1.1 Graphs

Graph context menu:
- Un-Zoom One: undo one zoom
- Un-Zoom All: full zoom-out
- Unhide All Curves
- Maximize Graph: switch this graph to the one-panel layout
- Show sessions in range: show the sessions sent/received during the displayed time range (delayed by 1 minute)
- Copy Image: copy the image to the clipboard
- Save Image As: save the graph as an image
- Print Graph: print the graph
- Export Graph CSV: export the data points to a CSV file

Curve context menu:
- Hide All but This: hide all curves except the selected one
- Copy Curve Data: copy the curve's data points to the clipboard
- Export Series CSV...: export the curve's data points as CSV

Available graphs:
- KPI
- Windows server(s): performance counters
- Pages
- Transactions
- Test Cases

Help box: Graph context menu
To show the sessions sent/received within a time range:
1. Select the time range to zoom it to the full graph.
2. Click "Show sessions in range…". The sessions are displayed in the session grid.
Note: The test log is updated with a one-minute delay.
- To zoom out one step or all steps, click Un-Zoom One/All.
- To stop/resume time auto-scrolling, scroll to the left/right.
- To show hidden curves, click "Unhide".
Other commands: Copy, Save or Print the graph image; export the graph data points. For more options, right-click a curve.

Help box: Graph curve context menu
- To hide all but the selected curve, click "Hide".
- To unhide all curves, click "Unhide" in the graph context menu.
- To copy or export curve data, click Copy or Export.

1.4.1.2 Curve Grid

Help box: Curve Grid context menu
To expand/collapse curve rows, double-click a graph row or click the plus/minus image. To show/hide a curve on a graph, check/uncheck the box on the corresponding curve row.
To highlight a curve on a graph, click the curve name on the corresponding curve row.

Context menu:
- Highlight Curve: highlight the curve on the graph
- Hide All Curves But This: show just this curve on the graph
- Unhide All Curves: show all curves on the graph

Curve Grid columns:
- Visible: check/uncheck the box to show/hide the curve
- Parameter: the name of the parameter represented by the curve
- Color: the curve color
- Range: the scale of the chart axis for this parameter
- Min: the minimum value of the parameter
- Max: the maximum value of the parameter
- Avg: the average value of the parameter
- Last: the last value of the parameter
- Warnings: the number of threshold violation warnings
- Errors: the number of threshold violation errors or missed goals

1.4.1.3 Test Progress Panel

Parameters:
- Time: the time elapsed from the beginning of the test
- Users: the number of instantiated VUs
- Iterations Started: the number of started test iterations
- Iterations Ended: the number of completed test iterations
- Requests Sent: the number of issued requests
- Requests Pending: the number of issued requests whose responses have not been received yet
- Responses OK: the number of received responses, excluding errors and timeouts
- Errors: the number of errors
- Timeouts: the number of timeouts
- SQL CE Capacity used: the percentage of the 4 GB storage limit used to store the test data accumulated up to this point

Help box: Test Progress Panel
The Test Progress Panel displays the test progress parameters. For more information, check Monitoring Test Progress and Health. If the SQL CE capacity used can reach 100%, learn how to Reduce Test Storage Use.

1.4.1.4 Agents and Test Cases Grid

Columns:
- Name: the Test Case or Agent name
- Users: the number of active VUs
- Iterations Started: the number of started test iterations
- Iterations Ended: the number of completed test iterations
- Requests Sent: the number of issued requests
- Responses Received: the number of received responses
1.4.1.4 Agents and Test Cases Grid

  Name - The Test Case or Agent name
  Users - The number of active VUs
  Iterations Started - The number of started test iterations
  Iterations Ended - The number of completed test iterations
  Requests Sent - The number of issued requests
  Responses Received - The number of received responses
  Errors - The number of errors
  Timeouts - The number of timeouts

1.5 Analyze Results

1.5.1 Opening Previous Results

Toolbar
  Refresh Previous Results - Click to refresh the Previous Results list
  Import Result - Click to open an SQL CE .sdf file
  Open Result - Open the selected result in a new tab
  Select All
  Unselect All
  Compare Tests - Generate a multi-test report for comparing the selected results
  Configure Result Storage settings
  Delete the checked results

Help Boxes

Previous Results
The list below displays the results of previous test runs.
- To refresh the result list, click "Show Previous Results".
- To load the selected result in a new result tab, click "Open" or double-click the result.
- To compare several results, check the boxes next to them and click "Compare Tests".
- To delete results, check the boxes next to them and click "Delete".
- To rename a result, right-click it and select Rename.
- To change the data storage settings or the amount of saved data, click "Configure Result Storage settings".
See Also: Opening Previous Results, Comparing Tests, Test Result Storage

Property Grid
  Data Storage Type - The type of repository storing the result
  Result Name - The name automatically created for every test run. In SQL CE, it is the name of the .sdf file.
  Location - SQL Server CE file name
  Date - The test run date
  Size (KB) - The SQL Server CE file size

1.5.2 Test Result Tab

Toolbar
  Summary - Test results at a glance
  Graphs - Graphs of Key Performance Indicators and Performance Counters
  Details - Details
  Errors - Errors
  VU Activity - VU Activity chart
  Waterfall - Waterfall chart
  Select Layout - Select layout
  Show Sessions - Show sessions matching selection criteria
  External Report - Create an external report
  Back to Previous Results - Back to Previous Results

Help Boxes

Test Result Views
Click a button on the left to select one of the following test result views: Summary, Details, Graphs, Errors, VU Activity, Waterfall. In the selected view, right-click for more options and help information.
See Also: Test Result Tab

Other Test Result Commands
- To select a graph or grid panel layout, click "Select Layout".
- To select sessions from the Test Log, click "Show Sessions".
- To generate a report, click "External Report".
- To select a Multi-document (default) or a Single-document report option, click the drop-down.
See Also: Query Log, External Reports

1.5.2.1 Graphs

Graph Context Menu
  Un-Zoom All
  Un-Zoom One
  Unhide All Curves
  Show sessions in range
  Copy Image
  Save Image As...
  Print Graph...
  Export Graph CSV...

Graph Curve Context Menu
  Hide All But This
  Copy Curve Data
  Export Curve CSV...

Help Boxes

Graph Context Menu
To show sessions related to this page or transaction, sent/received within a time range:
1. Select the time range to zoom it to a full graph.
2. Click "Show sessions in range...".
The sessions will be displayed in the session grid.
- To zoom out one step or fully, click Un-Zoom One/All.
- To show hidden curves, click "Unhide".
Other commands:
- Copy, Save, or Print the graph image
- Export the graph data points
For more options, right-click a curve.
See Also: Graph Context Menu, Curve Context Menu
1.5.2.2 Detail View

Help Boxes

Page Details
The "Page Details" grid displays the performance characteristics of each page from the end-user perspective.
Note: Page response time includes the load times of performance-impacting requests. It excludes requests loaded after the page is displayed (e.g. AJAX requests), as determined by StresStimulus.
See Also: Page Details

Transaction Details
The "Transaction Details" grid displays the performance characteristics of each transaction from the end-user perspective.
See Also: Transaction Details

Request Details
The "Request Details" grid displays the aggregated performance characteristics of each request, grouped by URL. Time characteristics are averaged. Request counts are summed. If a request timed out and subsequently failed, it is counted as a timeout (see the sketch below).
See Also: Request Details
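To make the grouping rules concrete, here is a small sketch of the same aggregation idea with hypothetical field names; it illustrates the stated rules, not StresStimulus internals.

```python
from collections import defaultdict

# Hypothetical request records; only the aggregation rules matter here.
requests = [
    {"url": "/home", "time_ms": 120, "timed_out": False, "failed": False},
    {"url": "/home", "time_ms": 480, "timed_out": True,  "failed": True},
    {"url": "/home", "time_ms": 150, "timed_out": False, "failed": True},
]

groups = defaultdict(lambda: {"times": [], "count": 0, "timeouts": 0, "errors": 0})
for r in requests:
    g = groups[r["url"]]          # group by URL
    g["times"].append(r["time_ms"])
    g["count"] += 1
    if r["timed_out"]:            # timed-out-then-failed counts as a timeout only
        g["timeouts"] += 1
    elif r["failed"]:
        g["errors"] += 1

for url, g in groups.items():
    avg = sum(g["times"]) / len(g["times"])  # time characteristics are averaged
    print(url, f"avg={avg:.0f}ms", f"count={g['count']}",
          f"timeouts={g['timeouts']}", f"errors={g['errors']}")
```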
Virtual User Details
The "VU Details" grid displays statistics of the test iterations executed by every VU.
See Also: VU Details

Test Case Details
The "Test Case Details" grid displays the performance characteristics of each test case.
See Also: Test Case Details

Test Case Group Details
The "Test Case Group Details" grid displays the performance characteristics of each test case group.
See Also: Test Case Group Details

Agent Details
The "Agent Details" grid displays the performance characteristics of each agent.
See Also: Agent Details

1.5.2.3 VU Activity

Help Boxes

Activity Chart
- The horizontal axis is a timeline.
- The vertical axis shows VUs.
- Horizontal bars represent test iterations executed by a VU.
- To zoom in to a specific VU/iteration range, select the appropriate rectangular area.
- To zoom out, right-click and select Un-Zoom.
For more options, right-click a horizontal bar.
Other context menu commands:
- Copy, Save, or Print the graph image
See Also: Graph Context Menu

Activity Chart Context Menu
Horizontal bars represent test iterations executed by a VU.
- To display a waterfall of the selected iteration, click View Waterfall.
- To compare the waterfall of the selected iteration with a previously selected waterfall, click Compare Waterfalls.
See Also: Graph Context Menu

Iteration Bar Context Menu
  View Waterfall - Double-click to display a waterfall for this VU/iteration
  Compare Waterfalls - Ctrl+double-click to display this VU/iteration in a dual waterfall on the right

Chart Context Menu
  Un-Zoom - Click to fully un-zoom this chart
  Copy Image - Copy the chart image to the clipboard
  Save Image As... - Save the chart as an image
  Print Graph... - Print the chart

1.5.2.4 Iteration Waterfall

Toolbar
  Enter the first VU
  Enter the first VU iteration
  Check to compare two waterfalls
  Enter the second VU
  Enter the second VU iteration
  Refresh the waterfall charts
  Check to enable zoom/scroll sync on the left and right charts
  Swap the charts

Context Menu
  Auto-Sync - Check to enable zoom/scroll sync on the left and right charts
  Diagonal Scrolling - Check to enable diagonal scrolling on this chart
  Un-Zoom - Click to fully un-zoom this chart
  Copy Image
  Save Image As...
  Print Graph...

Help Boxes

Waterfall View
- Select a VU and iteration.
- To compare two waterfalls, check Compare and select a second VU and iteration.
- Click Refresh to refresh the charts.
See Also: Waterfall View, Single Waterfall Chart, Dual Waterfall Chart

Waterfall Chart Commands
- To swap the charts, click Swap.
- To turn synchronization of chart scrolling and zooming on or off, click Sync/Un-sync.
See Also: Waterfall View, Single Waterfall Chart, Dual Waterfall Chart

1.5.2.5 Query Log

Help Boxes

Query Test Log
To display selected replayed sessions from the Test Log in the session grid, enter selection criteria and click "Show Sessions". Selection criteria formats and examples:
- for VUs, Iterations, and Sessions: 1-3, 5, 9
- for responses with Errors and/or Timeouts: check one or both boxes
- to filter by time range, check the box, select Send, Received, or both, and enter the time range in seconds
- for Test Cases and Agents: Name1, Name2
Note:
- Leaving textboxes empty broadens the search.
- Retrieving more than 1,000 records, as entered in the Max Sessions box, can impact performance.
See Also: Querying Test Log
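The VU/Iteration/Session criteria are comma-separated lists of numbers and ranges; the sketch below shows how such an expression expands (illustrative only, not the product's parser).

```python
def expand_ranges(spec):
    """Expand a criteria string such as '1-3, 5, 9' into {1, 2, 3, 5, 9}."""
    result = set()
    for part in spec.split(","):
        part = part.strip().rstrip(";")
        if not part:
            continue
        if "-" in part:                       # a range like "1-3"
            lo, hi = part.split("-", 1)
            result.update(range(int(lo), int(hi) + 1))
        else:                                 # a single number like "5"
            result.add(int(part))
    return result

assert expand_ranges("1-3, 5, 9") == {1, 2, 3, 5, 9}
```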
1.5.3 Page and Transaction Result Tab

Toolbar
  Summary - Page/transaction summary
  Performance - Page/transaction response time
  Latency - Page/transaction latency/server time breakdown
  Failures - The number of failures on the page/transaction
  % Failures - The percentage of failures on the page/transaction
  Requests - Page/transaction requests
  VU Activity - VU Activity chart
  Waterfall - Waterfall chart
  Show Sessions - Show sessions matching selection criteria
  Back to the test result

Help Boxes

Page Result Views
Click a button on the left to select one of the following page result views: Summary, Performance, Latency, Failures, Failures %, Requests, VU Activity, Waterfall. In the selected view, right-click for more options and help information.
- To select sessions from the Test Log, click "Show Sessions".
- To go back to the Test Result, click Back.
See Also: Page & Transaction Result Tabs, Querying Test Log

Transaction Result Views
Click a button on the left to select one of the following transaction result views: Summary, Performance, Latency, Failures, Failures %, Requests, VU Activity, Waterfall. In the selected view, right-click for more options and help information.
- To select sessions from the Test Log, click "Show Sessions".
- To go back to the Test Result, click Back.
See Also: Page & Transaction Result Tabs, Querying Test Log

1.5.3.1 Summary View

Help Boxes

Summary View
The Summary view lists a page's or transaction's basic performance metrics and failures. It includes subsections that can be expanded/collapsed by clicking the triangle icon.
See Also: Summary View

1.5.3.2 Performance View

Help Boxes

Performance View
The Performance view presents a page's or transaction's response-time timeline and its changes depending on the number of emulated VUs. It features five curves: the minimum, average, and maximum response times, the goal, and the number of VUs.
See Also: Performance View
Graph Context Menu
To show sessions related to this page or transaction, sent/received within a time range:
1. Select the time range to zoom it to a full graph.
2. Click "Show sessions in range...".
The sessions will be displayed in the session grid.
- To zoom out one step or fully, click Un-Zoom One/All.
- To show hidden curves, click "Unhide".
Other commands:
- Copy, Save, or Print the graph image
- Export the graph data points
For more options, right-click a curve.
See Also: Graph Context Menu

Graph Curve Context Menu
- To hide all but the selected curve, click "Hide".
- To unhide all curves, in the graph context menu, click "Unhide".
- To copy or export curve data, click Copy or Export.
See Also: Graph Curve Context Menu

1.5.3.3 Latency View

Help Boxes

Latency View
The Latency view presents a page's or transaction's response-time breakdown between latency and server time. The latency (or network time) is the portion of the response time attributed to the network delays necessary for server responses to reach the client.
See Also: Latency View
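In other words, the view splits response time into network latency plus server time; a minimal sketch of that breakdown (function and field names are hypothetical):

```python
def server_time(response_time_ms, latency_ms):
    """Server-side portion once network latency is subtracted out.

    Reflects the view's breakdown: response time = latency + server time.
    """
    return response_time_ms - latency_ms

# e.g. a 480 ms response with 130 ms of network latency spent ~350 ms on the server
assert server_time(480, 130) == 350
```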
1.5.3.4 Failures View

Help Boxes

Failure View
The Failure view helps to analyze the number of page or transaction failures. The graph presents a timeline of errors, timeouts, and missed goals and their changes depending on the number of emulated VUs.
See Also: Failure View

Failure % View
The Failure % view helps to analyze the percentage of page or transaction failures. The graph presents a timeline of errors, timeouts, and missed goals and their changes depending on the number of emulated VUs.
See Also: Failure View

1.5.3.5 Requests View

Help Boxes

Request View
The request grid displays the aggregated performance characteristics of each request related to this page or transaction, grouped by URL. Time characteristics are averaged. Request counts are summed. If a request timed out and subsequently failed, it is counted as a timeout.
See Also: Request View

1.5.3.6 VU Activity View

Activity Chart Context Menu
  Un-Zoom - Click to fully un-zoom this chart
  Copy Image - Copy the chart image to the clipboard
  Save Image As... - Save the chart as an image
  Print Graph... - Print the chart
Iteration Bar Context Menu
  View Waterfall - Double-click to display a waterfall for this VU/iteration
  Compare Waterfalls - Ctrl+double-click to display this VU/iteration in a dual waterfall on the right

Help Boxes

VU Activity View
The VU Activity view shows the activity of every VU during the test. Each row in the chart represents an individual VU. The row is broken down into differently colored horizontal bars, each of which represents a single test iteration. The x-axis displays the timeline of the load test run.
See Also: Page & Transaction Result Tabs

Activity Chart
- The horizontal axis is a timeline.
- The vertical axis shows VUs.
- Horizontal bars represent test iterations executed by a VU.
- To zoom in to a specific VU/iteration range, select the appropriate rectangular area.
- To zoom out, right-click and select Un-Zoom.
For more options, right-click a horizontal bar.
Other context menu commands:
- Copy, Save, or Print the graph image
See Also: Graph Context Menu

Activity Chart Context Menu
Horizontal bars represent test iterations executed by a VU.
- To display a waterfall of the selected iteration, click View Waterfall.
- To compare the waterfall of the selected iteration with a previously selected waterfall, click Compare Waterfalls.
See Also: Graph Context Menu

1.5.3.7 Waterfall View

Toolbar
  Enter the first VU
  Enter the first VU iteration
  Check to compare two waterfalls
  Enter the second VU
  Enter the second VU iteration
  Refresh the waterfall charts
  Swap the charts
  Check to enable zoom/scroll sync on the left and right charts
  Navigate to the VU Activity chart

Context Menu
  Auto-Sync - Check to enable zoom/scroll sync on the left and right charts
  Diagonal Scrolling - Check to enable diagonal scrolling on this chart
  Un-Zoom - Click to fully un-zoom this chart
  Copy Image
  Save Image As...
  Print Graph...

Help Boxes

Waterfall View
- Select a VU and iteration.
- To compare two waterfalls, check Compare and select a second VU and iteration.
- Click Refresh to refresh the charts.
See Also: Waterfall View, Single Waterfall Chart, Dual Waterfall Chart

Waterfall Chart Commands
- To swap the charts, click Swap.
- To turn synchronization of chart scrolling and zooming on or off, click Sync/Un-sync.
See Also: Waterfall View, Single Waterfall Chart, Dual Waterfall Chart

1.5.4 Comparing Tests

Help Boxes

Compare Multiple Tests
Click a button on the left to select the Summary or KPI Graph view. In the selected view, right-click for more options and help information.
See Also: Test Comparison Summary View, KPI Graph Comparison View

Toolbar
  Summary - Test comparison summary
  KPI Graph - KPI graph comparison

1.6 Workflow Tree Toolbar

Toolbar
  Back - Return one step back
  Tree View - Toggle to display the Test Case Tree on the left pane
  Grid View - Toggle to display the Session Grid on the left pane
  Show Recorded - Show recorded Test Case sessions in the session grid
  Test Wizard
  Run - Start a load test

Help Boxes

Workflow Tree Toolbar
- Click "Back" to go back one step in the Workflow Tree.
- Click "Tree View" to display the Test Case Tree on the left pane.
- Click "Grid View" to display the Session Grid on the left pane.
- Click "Show Recorded" to display Test Case sessions in the session grid.

Test Wizard / Run Test
- Click "Test Wizard". The wizard will guide you through the major steps of creating, configuring, and running a test.
- Click "Run" to start the test. Graphs will display the test results in progress. After the test completes, select reports from the Analyze Results section of the Workflow Tree.
See Also: Starting Test

1.7 Test Wizard

The Test Wizard guides you through five steps: Record Test Case, Configure Test Case, Configure Test, Run Test, and Analyze Results.

1.7.1 Record Test Case

Help Boxes

Create a Test Case
To record a Test Case by navigating through your application, select the recording source and click "Record".
See Also: Recording a test case

Browser Recording Settings
- Enter the initial URL and select a browser cache option.
- In Private Mode (recommended), the browser cache is not used.
- Enter the first transaction name (optional) and click Record.
See Also: Recording with Web Browser
1.7.2 Configure Test Case

1.7.2.1 Targeted Hosts

Toolbar
  Delete the requests to the selected hosts
  Add the selected hosts to the Excluded Hosts list and delete requests to these hosts
  Show the Excluded Hosts list
  Select All

Help Boxes

Test Case Hosts
This list displays the hosts targeted in this Test Case. Toolbar commands:
- Delete requests to the selected hosts from the Test Case.
- Add the selected hosts to the Excluded Hosts list. Requests to these hosts will be ignored in future recordings.
- Show the Excluded Hosts list.
See Also: Purging requests to unwanted hosts

1.7.2.2 Content-Types

Toolbar
  Delete the sessions with the selected content types
  Add the selected content types to the Excluded Content Types list and delete sessions with these content types
  Show the Excluded Content Types list
  Select All

Help Boxes

Test Case Content Types
This list displays the content types used in this Test Case.
Toolbar commands:
- Delete sessions with the selected content types from the Test Case.
- Add the selected content types to the Excluded Content Types list. Sessions with these content types will be ignored in future recordings.
- Show the Excluded Content Types list.
See Also: Purging sessions with unwanted content types

1.7.2.3 Autocorrelation

Help Boxes

AutoCorrelation
Autocorrelation is the automatic modification of requests issued during a test run to replace recorded values with the corresponding dynamic values received from the server in previous responses. Autocorrelation is necessary to preserve application integrity on dynamic websites and to avoid server errors. The wizard will now find and configure hidden autocorrelation parameters.
See Also: AutoCorrelation
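Conceptually, autocorrelation works like the sketch below: a dynamic value (here, a hypothetical session token in a hidden form field) is extracted from the previous response and substituted for the recorded value in the next request. This illustrates the idea only; it is not the product's implementation.

```python
import re

# Request body captured at record time (hypothetical recorded token value).
recorded_request_body = "action=checkout&session_token=REC123"

def autocorrelate(previous_response_html, request_body):
    """Replace the recorded token with the live one from the last response."""
    match = re.search(
        r'name="session_token"\s+value="([^"]+)"', previous_response_html)
    if match:  # substitute the dynamic server-issued value
        return request_body.replace("REC123", match.group(1))
    return request_body

live_response = '<input type="hidden" name="session_token" value="LIVE789">'
print(autocorrelate(live_response, recorded_request_body))
# -> action=checkout&session_token=LIVE789
```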
1.7.3 Configure Test

Help Boxes

Load Pattern
The load pattern defines the dynamics of virtual users (VUs) throughout the test.
See Also: Load Pattern

Test Duration
Set the test completion criteria. After reaching this condition, the test will stop.
See Also: Test Duration

1.7.4 Run Test

Help Boxes

Run Test
The wizard will now start the test execution. For more information, see Running and Monitoring Test.

1.7.5 Analyze Results

Help Boxes

Analyze Results
The wizard will now navigate through the main test results. For more information, see Analyzing Results.
2 OBJECT AREA

2.1 Test Case Tree

Page Context Menu Commands
  Rename Page
  Edit Page
  Clone Page
  Delete Page

Session Context Menu Commands
  Show Session Inspector
  Edit Session
  Clone Session
  Delete Session
  Create Response Extractor
  Create Req. URL Parameter
  Create Req. Header Parameter
  Create Response Validator

2.1.1 Upper Toolbar

Toolbar
  Expand All
  Collapse All
  Edit the selected object
  Delete the selected object
  Test Case hosts
  Dock to Fiddler on the left
  Dock to StresStimulus on the right

Help Boxes

Test Case Modification
To edit tree objects or view more details:
- Select an object and click "Edit".
- Double-click a page to navigate to the Test Case Settings grid.
- Double-click a request to display the Session Inspector.
- Double-click an extractor to navigate to the Extractors section.
- Double-click a parameter to navigate to the Parameters section.
- Double-click a validator to navigate to the Validators section.
- To delete an object, select it in the Test Case Tree and click "Delete" or hit Del.
- To delete multiple sessions, select them in the session grid and hit Ctrl+Del.
- To add new sessions selected in the session grid, drag and drop them into the desired position in the Test Case Tree.
- To reposition a request or page selected in the Test Case Tree, drag and drop it into a new position.

Test Case Tree Commands
- To show the hosts targeted in this Test Case, click "Test Case hosts".
- To dock the Test Case Tree to Fiddler on the left or to StresStimulus on the right, click "Dock...".

2.1.1.1 Test Case Hosts

Toolbar
  Delete the requests to the selected hosts
  Add the selected hosts to the Excluded Hosts list
  Show the Excluded Hosts list

Help Boxes

Test Case Hosts
This list displays the hosts targeted in this Test Case. Toolbar commands:
- Delete requests to the selected hosts from the Test Case.
- Add the selected hosts to the Excluded Hosts list. Requests to these hosts will be ignored in future recordings.
- Show the Excluded Hosts list.
See Also: Purging requests to unwanted hosts

2.1.2 Lower Toolbar

Toolbar
  Find Next (F3)
  Find Previous (Shift+F3)
  Find Sessions by Content... (Ctrl+F)
  Clear Search
  Delete the found highlighted sessions
  Filter Objects - Click the button to show all objects; click the drop-down to select which session types to show.
  Filter Objects - Click the button to show sessions only; click the drop-down to select which session types to show.

Filter Objects drop-down options:
  Click to show recorded primary requests
  Click to show all recorded requests except images, stylesheets, and scripts
  Click to show all recorded requests
  Click to show recorded requests with errors and warnings
  Hide autocorrelation parameter details
  Show autocorrelation parameter details

Help Boxes

Session Search / Filter
- To find a URL, start typing in the Search URLs box.
- To find the next/previous URL, click "Find Next/Previous".
- To find and highlight sessions by request/response content, click "Find Sessions by Content" or hit Ctrl+F.
- To clear session highlighting, click "Clear Search".
- To delete highlighted sessions, click "Delete highlighted".
- To toggle between showing all objects and sessions only, click "Filter Objects".
- To select which sessions to display, click the "Filter Objects" drop-down.
- To show or hide autocorrelation parameter details, click "Show" or "Hide".
See Also: Searching Test Case Tree, Filtering Test Case Tree

2.1.3 Session Inspector

Toolbar
  Unlock for Editing
  Save session changes
  Split the window at 1/4
  Split the window at 1/2
  Split the window at 3/4

Help Boxes

Session Inspector
The Session Inspector displays:
- the request in the top text box
- the response in the bottom text box
To edit the session content, check the "Unlock for Editing" box.

2.2 Session Grid

Help Boxes

Fiddler Grid
To view test sessions in the Fiddler Grid, click the arrow on the "Show..." split-button on the toolbar above the Workflow Tree and select which sessions to show.
1. The "VU number" column displays a VU in the <User XXX> format.
2. Iterations and requests are displayed in the "Iter-URL" column as <YYY-ZZZ>, where YYY is an iteration number for user XXX and ZZZ is a request number within the iteration.
3. Replayed sessions: primary requests are displayed in bold gray; dependent requests are displayed in gray.
4. To delete selected recorded sessions from a test case, hit Ctrl+Del.
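For scripted post-processing of grid data, the two formats decode easily; a minimal sketch with hypothetical cell values:

```python
import re

def parse_grid_row(vu_cell, iter_url_cell):
    """Decode '<User XXX>' and 'YYY-ZZZ' grid cells into numbers."""
    vu = int(re.search(r"<User\s+(\d+)>", vu_cell).group(1))
    iteration, request = (int(n) for n in iter_url_cell.split("-", 1))
    return vu, iteration, request

# Hypothetical cells: VU 3, iteration 7, request 12 within that iteration
assert parse_grid_row("<User 003>", "007-012") == (3, 7, 12)
```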