Set up parameterized testing

Generally, you want to perform the following operations in the order described, though there may be some concurrency, and some steps are optional, as noted.

  1. (Optional) Define the Test Parameters Library. (Project Administrator)

  2. (Optional) Define default Test Parameters in Test Run Templates. (Testing Manager or Lead)

  3. Insert Test Parameters in the text of Test Steps in Work Items of the Test Case type. This step may take place concurrently with the next step. (Test Specification Author)

  4. (Optional) Create Test Runs and define Test Run Parameters in the Properties. This step may take place concurrently with Test Parameter specification in Test Cases. (Testing Manager or Lead)

  5. In the Test Runs, provide actual Test Parameter values, optionally adding Iterations, for the Test Cases selected in the Test Run configuration. (Testing Manager or Lead)

  6. Execute Test Cases following the Test Steps through all Iterations. (Tester)

Define the Test Parameters Library

When composing Test Cases, test specification authors who have the requisite permissions can optionally save Test Parameters they create to the project's Test Parameters Library. For example, if a common test environment is a web browser, it would make sense to add a parameter named Browser to the library. The parameters saved to the library appear as items in the Insert Test Parameter select list in the Test Steps table of Work Items of Test Case type, when a Test Case author is editing the table and defining Test Steps. It is also shown in the Parameter Name column of the Manage Test Parameters dialog box, accessible in Test Runs and Test Run Templates.

Administer the Test Parameters Library:

An administrator can access this library in the project Administration and perform several operations including defining new Test Parameters and modifying or deleting existing parameters. In the Administration configuration, you can only specify Test Parameter name and type. Parameter values are specified by users who create Test Runs.

Tip:

In the first releases of the parameterized testing feature, only the String type is supported.

Access and manage test parameters in the library:

  1. Open the project while logged in with administrator permissions and enter Administration.

  2. Expand the Testing topic and select Test Parameters Library.

Note:

Users must be granted repository access permissions for the test-parameters-library.xml file in the project repository in order to add parameters to the Test Parameters Library. Writing to that file is not granted by default for nonadministrator users.

The path in the access management client is: [PROJECT_FOLDER]/.polarion/testing/configuration/test-parameters-library.xml

Define Test Parameters in a Test Run template

You can configure Test Run Templates with Test Parameters so that when a Test Run is instantiated from a template, it contains the parameters defined there. In essence, you define the default Test Parameters for all Test Runs based on the Template. You can add Test Run Parameters in both new and existing Test Run Templates. You can optionally provide default values for the parameters you define in a Test Run Template. The values appear in Test Runs instantiated from the template.

In a Test Run instantiated from a template, users with the necessary permissions can:

  • Remove default parameters provided by the template.

  • Change any default parameter values provided by the template.

  • Define new parameters and values in the Test Run. These do not affect the template.

To work with Test Run Parameters in Test Run Templates, you must have permission to MODIFY Test Runs, and permission to DEFINE TEST PARAMETERS in Test Runs.
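Conceptually, instantiating a Test Run from a template copies the template's parameter set into the new run, which can then diverge without affecting the template. The following Python sketch models that behavior; the data shapes are illustrative assumptions, not Polarion's data model.

```python
import copy

def instantiate_test_run(template_params, name):
    """Create a Test Run that starts with the template's default parameters.

    Later edits to the run's parameters never flow back to the template,
    because the parameter set is copied, not shared.
    """
    return {"name": name, "params": copy.deepcopy(template_params)}

# A template with one default parameter value (names are illustrative).
template = {"Browser": "Firefox 40"}

run = instantiate_test_run(template, "Sprint-12 Regression")
run["params"]["Browser"] = "Chrome 45"   # change a default value in the run
run["params"]["OS"] = "Android"          # define a new parameter in the run

print(template)  # the template is unchanged: {'Browser': 'Firefox 40'}
```

The deep copy is the important design point: it is what lets a Test Run remove, override, or extend the defaults without touching the template.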

  1. Open the project, and in Navigation select Test Runs.

  2. In the table of existing Test Runs, click Manage Templates. The table now lists all currently defined Test Run Templates.

  3. In the table of Test Run Templates, select the Test Run Template you want to configure.

  4. In the detail of the selected template, click Properties and scroll down to the Test Parameters section. That section displays a table of currently configured parameters.

  5. Click the section header, and then click Manage.

  6. In the Manage Test Parameters dialog box, click the icon to add a new Test Parameter. Select from the list of parameters currently in the Test Parameters Library or enter the parameter name and type (if multiple types are available). Click OK when finished adding Test Parameters.

The table in the Test Parameters section now contains the parameters you added. You can optionally enter a value in the Value column for any or all parameters. This value appears in Test Runs instantiated from the Template. When finished, click Save to preserve changes, and then click Back to exit from Properties. If you want to configure another Test Run Template, select it in the table and repeat the preceding steps.

When finished with all templates, click Manage Templates to switch it off and return to the Test Runs management page.

Define Test Parameters and Values in Test Runs

This section applies to anyone who plans tests, creating and configuring Test Runs. When you create a new Test Run, you can define Test Parameters and their respective values. In the Test Cases selected for the Test Run, if the Test Steps contain a parameter of the same name as one defined in the Test Run, the Test Run supplies the value when the Test Cases are executed by testers. The Test Run author/planner must supply values for Test Parameters in Test Cases that do not have an equivalent parameter defined in the Test Run. Alternatively, testers can be given the necessary permission to modify parameters when executing the Test Cases of a Test Run.

For example, Test Case TC-1 might contain a step Open Browser, where "Browser" is a Test Parameter. A Test Run might then define a parameter Browser with the value Firefox 40. When TC-1 is selected for the Test Run, the tester sees the step as Open Firefox 40; the value "Firefox 40" is supplied by the Test Run during execution.

If TC-1 contains a step Enter Password in password field where Password is a Test Parameter, and an equivalent parameter is not defined in the Test Run, then the Test Run author/planner must provide a value (such as Pass999) when configuring the Test Run, so the tester will see Enter Pass999 in password field.
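The substitution behavior in these examples can be sketched as a simple template-resolution routine. This is a conceptual model only, not Polarion's implementation, and the `${name}` token syntax is an assumption made for illustration.

```python
import re

def resolve_step(step_text, run_params, case_params):
    """Replace each parameter token in a Test Step with its concrete value.

    Test Case-level values fill in parameters that have no equivalent in the
    Test Run, and override Test Run values where both exist.
    """
    values = {**run_params, **case_params}  # Test Case values take precedence

    def substitute(match):
        name = match.group(1)
        if name not in values:
            raise ValueError(f"No value supplied for parameter '{name}'")
        return values[name]

    return re.sub(r"\$\{(\w+)\}", substitute, step_text)

# Browser comes from the Test Run; Password must be supplied at the
# Test Case level because the Test Run does not define it.
run_params = {"Browser": "Firefox 40"}
case_params = {"Password": "Pass999"}

print(resolve_step("Open ${Browser}", run_params, case_params))
print(resolve_step("Enter ${Password} in password field", run_params, case_params))
```

Raising an error for an unresolved parameter mirrors the product behavior described later: a Test Case cannot be executed while any parameter lacks a value.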

To specify Test Parameters, you should be familiar with the Test Cases, or with the needs of the Test Case authors who are writing the Test Cases and Test Steps that will be tested when executing the Test Run. For example, if some Test Cases need to be executed in different web browsers, it is clearly useful for the Test Run to have a parameter named Browser.

Define Test Parameters in a Test Run:

  1. In your project, navigate into the Test Runs topic.

  2. On the Test Runs page, select the Test Run you want to configure or create a new Test Run.

  3. In the detail of the selected Test Run, click Properties and scroll down to the Test Parameters section. This section contains a table of currently configured Test Parameters. In a new Test Run, any parameters configured in the Test Run Template on which the Test Run is based appear in the table.

  4. Click the Test Parameters header, and then click Manage to open the Manage Test Parameters dialog box.

  5. Click Select Parameter. The list contains Test Parameters in the Test Parameters Library that are not currently used in the Test Run.

  6. If you want to add one of the library parameters to the Test Run, select it in the list, and select the type (if multiple types are listed).

    If you want to define a new parameter in this Test Run, select Add New in the list. In the Add New Parameter dialog box, enter the parameter name and specify the type (if multiple types are available). If you want the new parameter to be available project wide, select the Add to Library option (enabled only if you have permissions to add to the library).

  7. After adding all Test Parameters, if you want to provide values for any of them, provide them in the Value field of the respective parameters.

  8. Click Save to preserve changes, and then click Back to exit from Properties.

Your next task is to open the Test Cases selected for your Test Run and provide values for any Test Parameters they contain for which there is no equivalent Test Parameter in the Test Run. For example, if some Test Case has a parameter User that is not also configured in the Test Run, you need to provide a value for it at the Test Case level.

Supply values for Test Parameters in Test Cases:

  1. Select the Test Run on the Test Runs page and make sure Test Cases are already selected. (The selection mode must be Manually, By Query on Create, or From LiveDoc on Create.)

  2. With the Test Run selected in the table, click (Actions) and choose Select Test Cases. A new browser tab opens displaying the Work Items table, which lists the selected Test Cases. The Test Run Planning sidebar also opens in the view displaying information about the number of waiting executions and other Test Run statistics.

  3. For each of the Test Cases, in the sidebar panel labeled Iteration 1, enter values for all empty parameter fields.

  4. Click Save Test Run to update the Test Run.

By default, each Test Case has one Iteration. If a Test Case needs to be executed multiple times with different parameter values, click Add Iteration. A sequentially numbered Iteration panel is added to the sidebar, with parameter values prefilled from the previous Iteration. Modify the parameter values as needed so that testers process the steps correctly. Add additional Iterations until all necessary variations of the Test Case are covered. Be sure to save changes using Save Test Run.
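The Iteration behavior described above, where each new Iteration is prefilled from the previous one and then adjusted, can be modeled roughly as follows. The data shapes are assumptions for illustration, not Polarion internals.

```python
import copy

def add_iteration(iterations, overrides=None):
    """Append a new Iteration, prefilling parameter values from the last one.

    `iterations` is a list of {parameter name: value} dicts, one per Iteration.
    Only the values in `overrides` differ from the previous Iteration.
    """
    new_iteration = copy.deepcopy(iterations[-1]) if iterations else {}
    new_iteration.update(overrides or {})
    iterations.append(new_iteration)
    return new_iteration

# Iteration 1 always exists; run the same steps again on a second browser.
iterations = [{"Browser": "Firefox 40", "User": "alice"}]
add_iteration(iterations, {"Browser": "Chrome 45"})

print(iterations)
# The User value carried over from Iteration 1; only Browser changed.
```

Prefilling from the previous Iteration keeps the common values stable, so the planner only edits the parameters that actually vary between executions.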

When all Test Cases of the Test Run have values, you can close the browser tab and return to the Test Run, which is now ready for testers to execute unless you wish to make other changes.

Use Test Parameters in Test Steps

This section applies mainly to authors of Test Cases. The Test Parameters feature (sometimes called "parameterized testing") enables you to write abstract test procedures, inserting Test Parameters in place of some concrete specifications. When the Test Case is selected in a Test Run, a test planner, manager, or tester with the requisite permissions can supply concrete values for the parameters in the Test Steps. These values become part of the automatically created test records.

For example, in a Test Case that describes a test that must be run on different mobile operating systems, instead of writing "Android", "iOS", or "Windows Mobile" in a Test Step, you could insert a parameter named OS. When adding the Test Case to a Test Run, a test manager can then supply a concrete value in place of OS, depending on which platform is being tested.

The Test Parameters can be defined in any or all of several different places:

  • In the project's Test Parameters Library in project Administration

  • In Test Run Templates

  • In individual Test Runs

  • In Test Steps of Test Cases as they are being written

When writing a Test Step in a Document or in the Work Items Table, you can insert any existing Test Parameter (already defined in the Test Case or in the Test Parameters Library) into your text. Alternatively, you can define a new Test Parameter and insert that, optionally adding it to the library. To add parameters to the library, you must have access permissions granted for the relevant folder in the Polarion repository; writing to this location is not granted by default for nonadministrator users. (See Define the Test Parameters Library.) You must also have the Polarion user permission ADD TO PARAMETERS LIBRARY.

The list of parameters available for insertion includes those already defined in the Test Case and/or in the Test Parameters Library.

Insert a Test Parameter in a Test Step:

  1. Click the header of the Test Steps table in the Test Case. The table transitions to edit mode.

  2. Place the insertion cursor at the point in the table where you want to insert a Test Parameter. This is typically in the Step Description column.

  3. In the drop-down options list beneath the last row of the table, click Insert Test Parameter.

  4. Select one of the parameters in the drop-down list; you can filter the list by typing. Alternatively, create a new parameter by clicking the Add New item and entering properties for a new parameter in the Add New Parameter dialog box.

Provide parameter values in Test Cases

This section applies mainly to a testing manager or team leader creating and setting up Test Runs. During planning of a Test Run, the author/planner must supply values for any Test Parameters defined only on the Test Case level (that is, with no equivalent parameter in the Test Run). For example, if the Test Case author defined a parameter OS, the Test Run author/planner must replace that with some concrete value, such as iOS or Android, if there is no parameter in the Test Run named OS.

When the Select Test Cases field of the Test Run is set to Manually, By Query on Create, or From LiveDoc on Create, you can click Waiting in the Test Run Status section of the Test Run to load the selected Test Cases in the Work Items table. Then you can begin supplying parameter values per the steps outlined below. When the Test Case selection mode is set to By Query on Execute or From LiveDoc on Execute, Test Parameter values cannot be specified at the Test Cases level. Values must be set in the Test Run, from which the Test Cases selected on execute will inherit the values. Also, multiple iterations per Test Case are not supported for the "on Execute" selection modes.

Provide Test Parameter values in the Test Cases of a Test Run:

  1. Select the Test Run in the Test Runs topic in your project.

  2. Click Waiting in the Test Run Status section of the Test Run. A new browser tab opens and displays the Work Items table, with the Test Cases of the Test Run listed and the Test Run Planning sidebar open.

  3. Select the first Test Case in the table, and in the Parameters section of the sidebar, specify values for any Test Parameters that do not yet have one. Any prefilled values come from the equivalent parameter in the Test Run. You can overwrite these values in the Test Case, if desired.

  4. If the Test Steps need to be executed multiple times with different parameter values, click Add Iteration and provide the necessary parameter values. Create any additional Iterations needed. See Defining the Iterations, below.

  5. Click Save Test Run in the sidebar to update the Test Run.

  6. Click the (Refresh) icon in the editor toolbar of the Test Case and check that all parameters are replaced with actual values in all Iterations in the Execute Tests section of the Test Case. Any parameters missing a value are highlighted in light red.

Repeat these steps for all other Test Cases of the Test Run listed in the table until you have provided Test Parameter values for all of them.

Define multiple Iterations of Test Steps in Work Items of the Test Case type:

  1. Be sure you have specified all missing values in Iteration 1, which is always present and cannot be removed.

  2. Click Add Iteration in the sidebar to create another Iteration.

  3. Fill in values for all Test Parameters. Repeat this step to create as many Iterations as you need to run the Test Steps with all possible parameter value variations.

  4. Click Save Test Run in the sidebar to update the Test Steps.

  5. Click the (Refresh) icon in the editor toolbar of the Test Case.

You should now see all actual values for all Test Parameters in the Test Steps, highlighted in light blue. Any undefined parameters are highlighted in light red. You should see tabs on the left-hand side of the Test Steps, one for each Iteration you defined. You can select the tabs and check that the Test Parameters all show the correct values.

Fix any still-missing parameter value(s) in the Iteration(s) via the Test Run Planning sidebar, save the changes, and refresh the view to check again that all parameters show the correct values. Repeat these procedures for all Test Cases of the Test Run, selecting each one in the Work Items table in turn.

Warning:

Testers cannot execute a Test Case until all Test Parameters in all Iterations have a value. If execution cannot proceed for this reason, a dialog appears informing the tester about missing Test Parameter values.
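The readiness check that this warning describes amounts to a simple validation over every Iteration of a Test Case. The sketch below is a conceptual model of that check, not Polarion code; parameter and iteration shapes are assumptions for illustration.

```python
def missing_values(iterations, required_params):
    """Return (iteration number, parameter name) pairs that still lack a value.

    A Test Case is ready for execution only when the returned list is empty.
    An empty string counts as missing, matching the "no value" case.
    """
    gaps = []
    for number, values in enumerate(iterations, start=1):
        for name in required_params:
            if not values.get(name):  # absent or empty value
                gaps.append((number, name))
    return gaps

iterations = [
    {"Browser": "Firefox 40", "Password": "Pass999"},
    {"Browser": "Chrome 45", "Password": ""},  # value still missing
]
print(missing_values(iterations, ["Browser", "Password"]))
# → [(2, 'Password')]
```

In the product, the equivalent of a nonempty result is what blocks execution and triggers the dialog informing the tester about missing Test Parameter values.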