Test designer maintains the test repository

Submitted by abe on Fri, 03/04/2022 - 16:14
Business phase: Answered user needs
Specification status: Draft
Implementation status: Planned
Description

As a test designer, I shall be able to maintain the test repository or a subset of it. This operation includes:

  • Defining new test cases;
  • Editing existing test cases, including status updates;
  • Archiving test cases.

A test case shall have:

  • A name;
  • A keyword;
  • A version;
  • A summary;
  • A status (ready, in progress, deprecated, archived);
  • A type*;
  • A grading requirement**;
  • A review status (whether it has been reviewed and approved or not yet);
  • A description (shall support multiple languages);
  • The test roles involved in the test;
  • The list of test steps to execute.

In addition, the tool shall log the initial author of the test case, as well as the date/time of the last modification and the user who made it.
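
To make the attribute list above concrete, the sketch below models such a test case as a plain data holder, including the audit fields mentioned in this paragraph. It is a minimal illustration only; all class, field, and enum names are assumptions, not the actual Gazelle Test Management schema.

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;

// Hypothetical representation of a test case as described above; names are illustrative only.
public record TestCase(
        String name,
        String keyword,
        String version,
        String summary,
        Status status,
        TestType type,
        GradingRequirement gradingRequirement,
        boolean reviewedAndApproved,            // review status
        Map<String, String> descriptions,       // language code -> description text (multi-language support)
        List<String> testRoles,                 // test roles involved in the test
        List<String> testSteps,                 // ordered list of test steps to execute
        String initialAuthor,                   // logged when the test case is created
        String lastModifiedBy,                  // logged on every modification
        Instant lastModifiedAt) {

    public enum Status { READY, IN_PROGRESS, DEPRECATED, ARCHIVED }

    public enum TestType { CONFORMANCE, PREREQUISITE, INTEROPERABILITY }

    public enum GradingRequirement { SELF_ASSESSMENT, REALTIME_REVIEW, OFFLINE_REVIEW }
}
```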

* In the current version of the tool, in addition to the type, there is a "peer type" attribute which indicates whether a test targets one single system, several systems (peer-to-peer), or a larger group of systems. Experience shows that this attribute is no longer needed if we choose appropriate wording for the test type attribute (a rough mapping sketch follows the list below):

  • Conformance (i.e. any test performed with a tool or by manual inspection)
  • Prerequisite (i.e. "do/read this first" tests)
  • Interoperability (i.e. the existing peer-to-peer and group tests)
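
As an illustration of why the peer type attribute becomes redundant, the sketch below folds the legacy peer-type values into the new type attribute. The class, enum, and method names are hypothetical, and treating single-system tests as Conformance is an assumption drawn from the descriptions above.

```java
// Hypothetical mapping of the legacy "peer type" values onto the new test type attribute.
// Names and the single-system branch are assumptions, not the tool's actual behaviour.
public final class TestTypeMigration {

    public enum LegacyPeerType { SINGLE_SYSTEM, PEER_TO_PEER, GROUP }

    public enum TestType { CONFORMANCE, PREREQUISITE, INTEROPERABILITY }

    public static TestType toTestType(LegacyPeerType peerType, boolean doReadThisFirst) {
        if (doReadThisFirst) {
            return TestType.PREREQUISITE;                            // "do/read this first" tests
        }
        return switch (peerType) {
            case PEER_TO_PEER, GROUP -> TestType.INTEROPERABILITY;   // existing peer-to-peer and group tests
            case SINGLE_SYSTEM -> TestType.CONFORMANCE;              // tests with a tool or manual inspection (assumed)
        };
    }
}
```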

** In the current version of the tool,

  • The Pre-Connectathon tests are not reviewed by a monitor; the SUT operator performs a self-assessment and sets them as "Verified by vendor".
  • Among the Connectathon tests, some are prerequisites (read/do this first). The monitor is asked to mark them as "Verified" without any further verification; they could instead simply be marked as "Done" by the SUT operator.
  • The monitors can grade most of the Connectathon tests without asking the SUT operator to execute the test live. However, some tests require the monitor to sit with the SUT operator.

The grading requirement shall take one of the following values (a minimal sketch follows this list):

  • Self assessment, if the SUT operator is allowed to set the status themselves;
  • Realtime review, if the grading requires the monitor to see the execution live;
  • Offline review, if the grading can be performed mainly on the evidence collected in Gazelle.
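
To illustrate the intent of these three values, here is a minimal sketch of a grading-permission check with hypothetical role and enum names; whether a monitor may also grade self-assessment tests is an assumption.

```java
// Hypothetical grading-permission check; role and enum names are illustrative only.
public final class GradingPolicy {

    public enum GradingRequirement { SELF_ASSESSMENT, REALTIME_REVIEW, OFFLINE_REVIEW }

    public enum Role { SUT_OPERATOR, MONITOR }

    /** Returns true if the given actor is allowed to set the test status. */
    public static boolean canGrade(GradingRequirement requirement, Role actor) {
        return switch (requirement) {
            case SELF_ASSESSMENT -> true;                                   // SUT operator sets it; a monitor may too (assumed)
            case REALTIME_REVIEW, OFFLINE_REVIEW -> actor == Role.MONITOR;  // only monitors grade reviewed tests
        };
    }

    /** Only realtime review requires the monitor to watch the execution live. */
    public static boolean requiresLiveExecution(GradingRequirement requirement) {
        return requirement == GradingRequirement.REALTIME_REVIEW;
    }
}
```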

Stories in feature

Title: Test designer marks a test as "Grading required"
Status: Planned