test-results

The test-results resource sends test results to the ALM Octane server.

This is a public synchronous resource. You must log in to the server using the Authentication API before using this resource.

Using this resource, you can:

URI

/test-results

Supported HTTP methods

This resource supports only the POST operation:

POST .../api/shared_spaces/<shared_space_id>/workspaces/1002/test-results

The XML payload

The XML payload must conform to the XSD schema as described below.

Note: Before you start working with this API, retrieve the XSD schema of the payload using a GET operation: GET .../api/shared_spaces/<shared_space_id>/workspaces/1002/test-results/xsd

The payload describes:

The following table describes the XML elements to use in the payload for each entity type in ALM Octane:

To refer to this ALM Octane entity / attribute | Use this ALM Octane entity name in REST | Use this element/attribute in the payload XML
Release | release | release or release_ref
Story or Feature | story or feature | backlog_items or backlog_item_ref
Application module | product_area | product_areas or product_area_ref
Test | test | test_fields or test_field or test_field_ref
Environment | taxonomy_node | environment or taxonomy or taxonomy_ref
Automated run | run_automated | test_runs or test_run
Build | ci_build | build_name
CI server | ci_server | server_id
Build job | ci_job | job_name

Parameters

This resource supports the following parameters: 

skip-errors (Boolean)

Whether parsing ignores recoverable errors and continues to process data. The default is false.

  • true. Parsing ignores references to non-existing ALM Octane entities and any other recoverable errors. All valid data is processed.

  • false. When encountering a recoverable error within a test element, that element is skipped completely, and processing continues with the next element.

    For details on how conflicts between data at different levels are resolved when ignoring errors, see Resolving conflicts between per-report and per-test configuration.
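
For example, to POST test results while ignoring recoverable errors, append the parameter to the URI shown above:

POST .../api/shared_spaces/<shared_space_id>/workspaces/1002/test-results?skip-errors=true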

Payload definition

test_run element

The following table describes the test_run element attributes:

Attribute name Description

module

The name of the module or component that contains the test. For example, the name of the Maven module.

package

The location of the test inside the module. For example, the name of the Java package.

class

The name of the file that contains the test. For example, the name of the Java class.

name

The name of the executed test. For example, the name of the Java method.

duration

The duration of the test execution, in milliseconds.

status

The final status of the test execution. Possible values: 'Passed', 'Failed', 'Skipped'.

started

(Optional) The starting time of the test execution.

This is the number of milliseconds since the standard base time known as "the epoch": January 1, 1970, 00:00:00 GMT.

external_report_url

The URL to a report in the external system that produced this run.
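
For example, a single test_run element combining these attributes might look like the following sketch (all values, including the report URL, are illustrative only):

<test_run module="/helloWorld" package="hello" class="HelloWorldTest" name="testOne"
  duration="3" status="Passed" started="1430919295889"
  external_report_url="http://my-ci-server/reports/testOne"/>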

In ALM Octane:

Note: When POSTing test and test run results, ALM Octane's ability to find a unique match determines if a new test or test run is created, or if an existing one is updated (simulating a PUT operation). For each reported test run:

Test characteristics

test_fields element

The test_fields element is used to set the values of the following fields for new or updated tests or test runs: test level, type, testing tool, and framework.

These are list-based fields that correspond to ALM Octane list fields. This means you can select a value from a predefined list (or add values to that list).

Field name | Field value cardinality | Logical name | Display name | Predefined values
test-level | single | test_level | Test Level | Unit Test, System Test, Integration Test
type | multiple | test_type | Test Type | Acceptance, Regression, End to End, Sanity, Security, Performance
testing-tool | single | testing_tool_type | Testing Tool | UFT, LeanFT, StormRunner, Selenium, Manual
framework | single | je.framework | Framework | JUnit, UFT, TestNG

There are two ways to refer to a field value: by ID, using the test_field_ref element, or by type and value, using the test_field element.
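
Both forms appear in Sample 1 below. For example (the ID and the type/value pair are illustrative):

<test_fields>
   <test_field_ref id="1005"/>
   <test_field type="hp.qc.test-new-type" value="hp.qc.test-new-type.acceptance"/>
</test_fields>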

Test Run context

You can specify the product / project context of a test by associating it with application modules, stories, and features.

product_areas element

If the payload contains the product_area_ref element, new or updated test entities are linked to the specified application modules. Application modules are listed by ID.

The specified application modules must exist in ALM Octane. Otherwise it is considered an error (unless you set the skip-errors parameter to true when POSTing the results).
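
For example (the ID is illustrative and must match an existing application module):

<product_areas>
   <product_area_ref id="1003"/>
</product_areas>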

backlog_items element

If the payload contains the backlog_item_ref element, new or updated test entities are linked to the specified stories or features. Stories and features are listed by ID.

The specified story or feature must already exist in ALM Octane. Otherwise it is considered an error (unless you set the skip-errors parameter to true when POSTing the results).
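
For example (the ID is illustrative and must match an existing story or feature):

<backlog_items>
   <backlog_item_ref id="1011"/>
</backlog_items>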

Tests Execution Context

You can specify the context of a test run by associating it with a release, a build, a pipeline, and environment information.

Optionally, link a test run to a pipeline context using the build element. This is only possible if the pipeline and other relevant entities already exist in ALM Octane.

release and release_ref elements

If the payload contains the release or release_ref element, new or updated test entities are linked to the specified release.
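
For example, Sample 1 below sets the release both per report and per test (the name and ID are illustrative):

<!-- per report, by name -->
<release name="MyRelease"/>
<!-- per test, inside a test_run element, by reference to an existing release -->
<release_ref id="1004"/>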

environment, taxonomy, and taxonomy_ref elements

In ALM Octane, environment elements are translated into Environment labels. There are two ways to specify an environment: by ID, using the taxonomy_ref element, or by type and value, using the taxonomy element.
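
For example (taken from Sample 1 below; the ID, type, and value are illustrative):

<environment>
   <taxonomy_ref id="1008"/>
   <taxonomy type="OS" value="Linux"/>
</environment>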

build element

You can use the optional build element to link test run entities to an existing pipeline in ALM Octane by identifying the CI server (server_id attribute), the build job (job_name attribute), and the build (build_name attribute).

Together, these attributes uniquely identify the pipeline context, which in turn identifies a pipeline in ALM Octane (the test report comes from a build produced by a build job that is part of a specific pipeline).

All specified items (CI server, build job, build) and the corresponding pipeline entity, as well as additional pipeline-related entities, must exist in ALM Octane. Otherwise, it is considered an error (unless you set the skip-errors parameter to true when POSTing).
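
For example (taken from Sample 2 below; the server ID, job name, and build name are illustrative):

<build server_id="7bdf7c06-1017-11e5-9493-1697f925ec7b" job_name="acceptanceTestsJob" build_name="35"/>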

Resolving conflicts between per-report and per-test configuration

The following rules govern how conflicts between the per-report and per-test configuration are resolved.

  1. If the following context elements are set both per report and per test, then the per-report and per-test information is merged for further processing: product_areas, backlog_items, taxonomy, fields (for characteristics that support multiple values).

  2. If skip-errors is false (default), then such conflicts are considered an error. If the conflict is at the report level, the processing stops. If it is in a single test, that test is skipped.

    For releases, this means:

    1. If the conflict is between the per-report release and pipeline context's release, then the processing stops.

    2. If the conflict is between the per-report release and per-test release or between the pipeline context's release and per-test release, then the test is skipped in processing.

If skip-errors is set to true:

Sample Payloads

Sample 1

<?xml version='1.0' encoding='UTF-8'?>
<test_result>
   <build server_id="uuid" job_id="junit-job" job_name="junit-job" build_id="1" build_name="1"/>
   <release name="MyRelease"/>
   <backlog_items>
      <backlog_item_ref id="1011"/>
   </backlog_items>
   <product_areas>
      <product_area_ref id="1003"/>
   </product_areas>
   <test_fields>
      <test_field_ref id="1005"/>
   </test_fields>
   <environment>
      <taxonomy_ref id="1004"/>
   </environment>
   <test_runs>
      <test_run module="/helloWorld" package="hello" class="HelloWorldTest" name="testOne" duration="3"
        status="Passed" started="1430919295889">
         <release_ref id="1004"/>
      </test_run>
      <test_run module="/helloWorld" package="hello" class="HelloWorldTest" name="testTwo" duration="2"
        status="Failed" started="1430919316223">
         <product_areas>
            <product_area_ref id="1007"/>
            <product_area_ref id="1008"/>
         </product_areas>
      </test_run>
      <test_run module="/helloWorld" package="hello" class="HelloWorldTest" name="testThree" duration="4"
        status="Skipped" started="1430919319624">
         <test_fields>
            <test_field_ref id="1006"/>
            <test_field type="hp.qc.test-new-type" value="hp.qc.test-new-type.acceptance"/>
         </test_fields>
      </test_run>
      <test_run module="/helloWorld2" package="hello" class="HelloWorld2Test" name="testOnce" duration="2"
        status="Passed" started="1430919322988">
         <environment>
            <taxonomy_ref id="1008"/>
            <taxonomy type="OS" value="Linux"/>
         </environment>
      </test_run>
      <test_run module="/helloWorld2" package="hello" class="HelloWorld2Test" name="testDoce" duration="3" 
         status="Passed" started="1430919326351">
         <backlog_items>
            <backlog_item_ref id="1012"/>
         </backlog_items>
      </test_run>
   </test_runs>
</test_result>

Sample 2

<?xml version='1.0' encoding='UTF-8'?>
<testResult>
   <build server_id="7bdf7c06-1017-11e5-9493-1697f925ec7b" job_name="acceptanceTestsJob" build_name="35"/>
   <tests>
      <test module="integration-tests" package="com.hp.integration.tests" />
   </tests>
 </testResult>      

Response

The test-results resource returns a job ID, which can later be queried for the test run creation status. For a list of possible return statuses, see Return Status.

The test report is streamed to ALM Octane and validated by the parser. After each batch of data is read, it is processed and stored.

The following errors may occur:

Non-recoverable error

The test report is not a well-formed XML document.

If a non-recoverable error occurs and the parser cannot continue reading the stream, processing of the test report stops.

Note: Valid data processed before the error may have already been sent to ALM Octane and saved there.

Recoverable error
  • Schema mismatch: Some required elements or attributes are missing, misplaced, or do not have the required type.

  • The test report references entities that do not exist in ALM Octane. A text string is returned containing a list of all encountered errors.

If a recoverable error occurs (such as a missing attribute, an invalid reference, and so on), processing continues:

  • If the skip-errors query parameter is set to false (default), the test element containing the error is not used, but all valid tests are processed.

  • If you set the skip-errors query parameter to true, processing includes all usable data from the test report, skipping only the problematic data.

    For details, see skip-errors.

Return Status

Use the job ID in the test-results response to query for the test run creation status.

Code Status Description
201 Created

Returned if:

  • No error was encountered.

  • The skip-errors query parameter was set to true and the test report was partially processed (at least one test was processed while parts with errors were skipped).

409 Conflict

Returned if:

  • The test report was not processed at all. This means one of the following:

    • The per-report context section at the beginning of the test report (the part before the tests element) is not well-formed XML.

    • The skip-errors query parameter is set to false and the per-report context section at the beginning of the test report does not conform to the XSD schema (for example, missing attributes) or refers to ALM Octane entities that do not exist.

  • The test report was only partially processed (meaning at least one test was successfully processed). This means there were errors inside the tests element and the skip-errors query parameter was set to false.

See also: