arjuna.tpi.engine.test module

arjuna.tpi.engine.test.skip(condition: bool, *, reason: str = None)

Builder for a Skip condition. Wraps pytest.mark.skip and pytest.mark.skipif.

Parameters:

condition – True/False, or condition code that evaluates to a bool; such code can also be given as a string. Alternatively, the string can simply be a statement of the reason for skipping.
Keyword Arguments:
 reason – Why this test is to be skipped.
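
For illustration, here is a minimal sketch of an advanced skip construct passed to the test decorator. It assumes the helpers are available via from arjuna import *, as in a typical Arjuna project; the condition string and reason are illustrative, and the string condition is assumed to be forwarded to pytest.mark.skipif:

    from arjuna import *

    # Skip this test when running on Windows. The string condition is
    # assumed to follow pytest.mark.skipif condition-string semantics.
    @test(skip=skip("sys.platform == 'win32'", reason="Not supported on Windows"))
    def check_linux_only_feature(request):
        pass
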
arjuna.tpi.engine.test.test(f: Callable = None, *, id: str = None, resources: ListOrTuple = None, drive_with: DataSource = None, exclude_if: Relation = None, xfail: boolOrXFail = False, skip: boolOrSkip = False, priority: int = 5, author: str = None, idea: str = None, component: str = None, app_version: str = '0.0.0', level: str = None, reviewed: bool = False, unstable: bool = False, tags: set = {}, bugs: set = {}, envs: set = {}, **test_attrs)

Decorator for marking a function as a test function.

Parameters:

f – A function with signature f(request). The name request is mandatory and enforced.

Keyword Arguments:
 
  • id – An alphanumeric string representing an ID that you want to associate with the test.
  • resources – Fixtures/Resources that you want to associate this test with. Wraps pytest.mark.usefixtures. Instead of using this, you can also pass the fixture names as direct arguments in the function signature (a usage sketch appears after this list).
  • drive_with

    Used for data-driven testing. The argument can be an Arjuna Data Source. Wraps pytest.mark.parametrize. If you use this argument, the test function signature must include a data argument, e.g.

    @test(drive_with=<DS>)
    def check_sample(request, data):
        pass
    
  • exclude_if – Defines an exclusion condition. The argument can be an Arjuna Relation. Wraps pytest.mark.dependency.
  • xfail – Mark this test as an expected failure by setting this to True. You can also use the xfail() helper to create an advanced xfail construct. Wraps pytest.mark.xfail.
  • skip – Mark this test as skipped by setting this to True. You can also use the skip() helper to create an advanced skip construct. Wraps pytest.mark.skip and pytest.mark.skipif.
  • priority – An integer value from 1 to 5 depicting the priority of this test, 1 being the highest and 5 the lowest.
  • author – Author of this test.
  • idea – The idea describing this test.
  • component – Primary software component that this test targets.
  • app_version – Version of the SUT that this test targets.
  • level – Level of this test.
  • reviewed – Has this test been reviewed?
  • unstable – Is this test unstable?
  • tags – Set of tags for this test.
  • bugs – Set of bugs associated with this test.
  • envs – Set of Environment names on which this test is supposed to run.
  • **test_attrs – Arbitrary name-value pairs to provide further test attributes.
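
As a combined illustration of resources and drive_with, here is a sketch of a data-driven test. It assumes from arjuna import * and Arjuna's record/records data markup; the fixture name logged_in_session and the record fields are hypothetical:

    from arjuna import *

    @test(
        resources=("logged_in_session",),   # hypothetical fixture name
        drive_with=records(
            record(user="user1", pwd="secret1"),
            record(user="user2", pwd="secret2"),
        ),
    )
    def check_login_variants(request, data):
        # Each record is exposed through the mandatory data argument,
        # e.g. data.user and data.pwd for the records above.
        pass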

Note

The test function name must start with the prefix check_

The test function must have at minimum the signature check_<some_name>(request), with request as the first argument.
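
Beyond the structural arguments, the metadata keyword arguments attach reporting and selection attributes to a test. The following sketch uses illustrative values; the ID, author, component, tag and bug entries are placeholders:

    from arjuna import *

    @test(
        id="auth-001",
        priority=2,
        author="Jane Doe",
        idea="Login page should load for a valid deployment",
        component="Authentication",
        app_version="1.2.0",
        tags={"smoke"},
        bugs={"BUG-1234"},
    )
    def check_login_page_loads(request):
        pass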

arjuna.tpi.engine.test.xfail(condition, *, reason: str = 'Expected Failure', raises: Exception = None, run: bool = True, strict: bool = False)

Builder for an Expected Failure condition and related decisions. Directly wraps pytest.mark.xfail: https://docs.pytest.org/en/latest/reference.html#pytest-mark-xfail

Parameters:

condition – True/False, or condition code that evaluates to a bool; such code can also be given as a string. Alternatively, the string can simply be a statement of the reason for the expected failure.

Keyword Arguments:
 
  • reason – Why this test is expected to fail. Mandatory if condition is a bool.
  • raises – Exception subclass expected to be raised by the test function (reported as XFailed); other exceptions will fail the test (reported as Failed).
  • run – Whether this test should be executed. If False, the test is not run at all.
  • strict – If False, the test suite is not marked as failed regardless of whether this test passes or fails. If True, an unexpected pass of this test marks the test suite as failed.
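
For illustration, here is a sketch of an advanced xfail construct passed to the test decorator. It assumes from arjuna import *; the condition string, reason and expected exception are illustrative:

    from arjuna import *

    # Expect this test to fail on macOS with an AssertionError. Because
    # strict is False, an unexpected pass does not fail the suite.
    @test(xfail=xfail(
        "sys.platform == 'darwin'",
        reason="Known rendering issue on macOS",
        raises=AssertionError,
        run=True,
        strict=False,
    ))
    def check_pixel_alignment(request):
        pass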