Writing buildspecs

A buildspec is the test recipe that buildtest processes to generate a test script. A buildspec may be composed of several test sections. The buildspec file is validated with the Global Schema, and each test section is validated with a sub-schema selected by the type field.

Let’s start off with an example:

version: "1.0"
buildspecs:
  variables:
    type: script
    executor: local.bash
    vars:
      X: 1
      Y: 2
    run: echo "$X+$Y=" $(($X+$Y))

buildtest will validate the entire file with global.schema.json; this schema requires the version and buildspecs keys in order to validate the file. The buildspecs section is where you define each test. In this example there is one test called variables. Every test requires a type field, which selects the sub-schema used to validate the test section. In this example, type: script informs buildtest to use the Script Schema when validating the test section.

Each sub-schema supports a list of fields; for example, type, executor, vars and run are all valid fields in the script schema. The version field specifies which version of the sub-schema to use. Currently all sub-schemas are at version 1.0, so buildtest validates the script schema with script-v1.0.schema.json. In the future, multiple versions of a sub-schema may be supported for backwards compatibility.

The executor key is required for all sub-schemas; it instructs buildtest which executor to use when running the test. The executors are defined in your buildtest settings, see Configuring buildtest.
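
For reference, here is a minimal sketch of an executors section covering the three local executors used on this page (local.bash, local.sh, local.python). This is an assumed layout: the sh and python entries simply mirror the bash executor shown later in this section, and your actual settings file may differ.

executors:
  local:
    bash:
      shell: bash
    sh:          # assumed: mirrors the bash executor layout
      shell: sh
    python:      # assumed: mirrors the bash executor layout
      shell: python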

In this example we define variables in the vars section, which is a set of key/value pairs for variable assignment. The run section is required for the script schema and defines the content of the test script.
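
The script schema also supports an env section for declaring environment variables as key/value pairs (see the declare_env example later on this page). A minimal sketch along the same lines as the variables test above:

version: "1.0"
buildspecs:
  environment:
    type: script
    executor: local.bash
    env:
      FOO: BAR
    run: echo "FOO is $FOO"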

Let’s look at a more interesting example. Shown below is a multi-line run example using the script schema with a test named systemd_default_target:

version: "1.0"
buildspecs:
  systemd_default_target:
    executor: local.bash
    type: script
    description: check if default target is multi-user.target
    run: |
      if [ "multi-user.target" == `systemctl get-default` ]; then
        echo "multi-user is the default target";
        exit 0
      fi
      echo "multi-user is not the default target";
      exit 1
    status:
      returncode: 0

The test name systemd_default_target defined in the buildspecs section is validated with the pattern "^[A-Za-z_][A-Za-z0-9_]*$". This test uses the executor local.bash, which means it uses the Local Executor with an executor named bash defined in the buildtest settings. The default buildtest settings provide a bash executor as follows:

executors:
  local:
    bash:
      description: submit jobs on local machine using bash shell
      shell: bash

The shell: bash field indicates this executor will use bash to run the test scripts. To reference this executor, use the format <type>.<name>; in this case, local.bash refers to the bash executor.

The description field is an optional key that can be used to provide a brief summary of the test. In this example we define a multi-line run section; this is achieved in YAML using run: | followed by the content of the run section indented 2 spaces.

In this example we introduce a new field, status, which controls how buildtest marks the test state. By default, a returncode of 0 is a PASS and a non-zero returncode is a FAIL. Currently buildtest reports only two states: PASS and FAIL. In this example, buildtest will match the actual returncode with the one defined in the returncode key of the status section.
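
Besides returncode, the status section can also run a regular expression check against the test output, as shown in the status_regex example later on this page. A minimal sketch, where the expression ^login is only a placeholder pattern:

version: "1.0"
buildspecs:
  hostname_regex:
    type: script
    executor: local.bash
    description: check hostname output against a regular expression
    run: hostname
    status:
      regex:
        stream: stdout
        exp: "^login"   # placeholder pattern, adjust for your system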

Script Schema

The script schema is used for writing simple scripts (bash, sh, python) in a buildspec. To use this schema you must set type: script. The run field defines the content of the test. The Production Schema and Development Schema are kept in sync: any schema development is done in the Development Schema and then merged into the Production Schema.

Return Code Matching

The next example illustrates returncode matching with different exit codes. It defines three tests: exit1_fail, exit1_pass and returncode_mismatch. All tests use the local.sh executor, which runs the tests with sh. We expect exit1_fail and returncode_mismatch to FAIL, while exit1_pass will PASS since its returncode matches.

version: "1.0"
buildspecs:

  exit1_fail:
    executor: local.sh
    type: script
    description: exit 1 by default is FAIL
    run: exit 1

  exit1_pass:
    executor: local.sh
    type: script
    description: report exit 1 as PASS
    run: exit 1
    status:
      returncode: 1

  returncode_mismatch:
    executor: local.sh
    type: script
    description: exit 2 failed since it failed to match returncode 1
    run: exit 2
    status:
      returncode: 1

To demonstrate, we will build these tests; pay close attention to the status field in the output:

$ buildtest build -b tutorials/pass_returncode.yml 
Paths:
__________
Test Directory: /Users/siddiq90/Documents/buildtest/var/tests

+-------------------------------+
| Stage: Discovered Buildspecs  |
+-------------------------------+ 

/Users/siddiq90/Documents/buildtest/tutorials/pass_returncode.yml

+---------------------------+
| Stage: Parsing Buildspecs |
+---------------------------+ 

 schemafile              | validstate   | buildspec
-------------------------+--------------+-------------------------------------------------------------------
 script-v1.0.schema.json | True         | /Users/siddiq90/Documents/buildtest/tutorials/pass_returncode.yml

+----------------------+
| Stage: Building Test |
+----------------------+ 

 Name                | Schema File             | Test Path                                                                                              | Buildspec
---------------------+-------------------------+--------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------
 exit1_fail          | script-v1.0.schema.json | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/exit1_fail/generate.sh          | /Users/siddiq90/Documents/buildtest/tutorials/pass_returncode.yml
 exit1_pass          | script-v1.0.schema.json | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/exit1_pass/generate.sh          | /Users/siddiq90/Documents/buildtest/tutorials/pass_returncode.yml
 returncode_mismatch | script-v1.0.schema.json | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/returncode_mismatch/generate.sh | /Users/siddiq90/Documents/buildtest/tutorials/pass_returncode.yml

+----------------------+
| Stage: Running Test  |
+----------------------+ 

 name                | executor   | status   |   returncode | testpath
---------------------+------------+----------+--------------+--------------------------------------------------------------------------------------------------------
 exit1_fail          | local.sh   | FAIL     |            1 | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/exit1_fail/generate.sh
 exit1_pass          | local.sh   | PASS     |            1 | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/exit1_pass/generate.sh
 returncode_mismatch | local.sh   | FAIL     |            2 | /Users/siddiq90/Documents/buildtest/var/tests/local.sh/pass_returncode/returncode_mismatch/generate.sh

+----------------------+
| Stage: Test Summary  |
+----------------------+ 

Executed 3 tests
Passed Tests: 1/3 Percentage: 33.333%
Failed Tests: 2/3 Percentage: 66.667%

Python example

You can use the script schema to write Python scripts in the run section. This is done with the local.python executor, assuming you have it defined in your buildtest configuration.

Here is a Python example calculating the area of a circle:

version: "1.0"
buildspecs:
  circle_area:
    executor: local.python
    type: script
    shell: python
    description: "Calculate circle of area given a radius"
    tags: ["python"]
    run: |
      import math
      radius = 2
      area = math.pi * radius * radius
      print("Circle Radius ", radius)
      print("Area of circle ", area)

The shell: python field lets us write a Python script in the run section. The tags field can be used to classify tests; it expects an array of string items.

Note

Python scripts are very picky when it comes to formatting. If you are defining a multi-line Python script in the run section, you must remember to use a 2-space indent to register the multi-line string. buildtest will extract the content from the run section and inject it into your test script. For a more complex Python script, you may be better off writing the script in a separate file and calling it from the run section.
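
For instance, here is a sketch of a buildspec that calls an external Python script instead of embedding it in the run section; the path $HOME/scripts/circle_area.py is hypothetical and assumed to exist on the test machine:

version: "1.0"
buildspecs:
  circle_area_external:
    type: script
    executor: local.bash
    description: run an external python script (hypothetical path)
    run: python $HOME/scripts/circle_area.py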

Skipping tests

By default, buildtest will run all tests defined in the buildspecs section. If you want to skip a test, use the skip field, which expects a boolean value. Shown below is an example:

version: "1.0"
buildspecs:
  skip:
    type: script
    executor: local.bash
    skip: true
    run: hostname

  unskipped:
    type: script
    executor: local.bash
    skip: false
    run: hostname

The first test, skip, will be skipped by buildtest because skip: true is defined.

Note

YAML and JSON have different representations for booleans. For JSON Schema, the valid values are true and false (see https://json-schema.org/understanding-json-schema/reference/boolean.html), however YAML has many more representations for booleans (see https://yaml.org/type/bool.html). You may use any of the YAML boolean forms, but it’s best to stick with the JSON Schema values true and false.
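
For instance, the following spellings of skip all parse to a true value under the YAML 1.1 boolean rules linked above, though only the first matches the JSON Schema form:

skip: true   # preferred, matches JSON Schema
skip: True   # YAML 1.1 boolean
skip: yes    # YAML 1.1 boolean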

Here is an example build; notice the message [skip] test is skipped. in the Parsing Buildspecs stage:

$ buildtest build -b tutorials/skip_tests.yml 
Paths:
__________
Test Directory: /Users/siddiq90/Documents/buildtest/var/tests

+-------------------------------+
| Stage: Discovered Buildspecs  |
+-------------------------------+ 

/Users/siddiq90/Documents/buildtest/tutorials/skip_tests.yml

+---------------------------+
| Stage: Parsing Buildspecs |
+---------------------------+ 

[skip] test is skipped.
 schemafile              | validstate   | buildspec
-------------------------+--------------+--------------------------------------------------------------
 script-v1.0.schema.json | True         | /Users/siddiq90/Documents/buildtest/tutorials/skip_tests.yml

+----------------------+
| Stage: Building Test |
+----------------------+ 

 Name      | Schema File             | Test Path                                                                                 | Buildspec
-----------+-------------------------+-------------------------------------------------------------------------------------------+--------------------------------------------------------------
 unskipped | script-v1.0.schema.json | /Users/siddiq90/Documents/buildtest/var/tests/local.bash/skip_tests/unskipped/generate.sh | /Users/siddiq90/Documents/buildtest/tutorials/skip_tests.yml

+----------------------+
| Stage: Running Test  |
+----------------------+ 

 name      | executor   | status   |   returncode | testpath
-----------+------------+----------+--------------+-------------------------------------------------------------------------------------------
 unskipped | local.bash | PASS     |            0 | /Users/siddiq90/Documents/buildtest/var/tests/local.bash/skip_tests/unskipped/generate.sh

+----------------------+
| Stage: Test Summary  |
+----------------------+ 

Executed 1 tests
Passed Tests: 1/1 Percentage: 100.000%
Failed Tests: 0/1 Percentage: 0.000%

Script Schema and Examples

The buildtest command line interface provides access to the schemas and example buildspecs for each schema.

To retrieve the full JSON schema for the script schema, run the following:

buildtest schema -n script-v1.0.schema.json --json

The example buildspecs are validated with the schema, so they are self-documenting. You can retrieve the script schema examples with buildtest schema -n script-v1.0.schema.json -e. Shown below are the valid and invalid examples, each validated against script-v1.0.schema.json.

$ buildtest schema -n script-v1.0.schema.json -e 



File: /Users/siddiq90/Documents/buildtest/buildtest/schemas/examples/script-v1.0.schema.json/valid/examples.yml
Valid State: True
________________________________________________________________________________


version: "1.0"
buildspecs:
  multiline_run:
    executor: local.bash
    type: script
    description: multiline run command
    run: |
      echo "1"
      echo "2"

  single_command_run:
    executor: local.bash
    type: script
    description: single command as a string for run command
    run: "hostname"

  declare_env:
    executor: local.bash
    type: script
    description: declaring environment variables
    env:
      FOO: BAR
      X: 1
    run: |
      echo $FOO
      echo $X

  declare_vars:
    executor: local.bash
    type: script
    description: declaring variables
    vars:
      First: Bob
      Last:  Bill
    run: |
      echo "First:" $First
      echo "Last:" $Last


  declare_shell_sh:
    executor: local.sh
    type: script
    description: declare shell name to sh
    shell: sh
    run: hostname

  declare_shell_bash:
    executor: local.bash
    type: script
    description: declare shell name to bash
    shell: bash
    run: hostname

  declare_shell_python:
    executor: local.python
    type: script
    description: declare shell name to python
    shell: python
    run: |
      print("Hello World")

  declare_shell_bin_bash:
    executor: local.bash
    type: script
    description: declare shell name to /bin/bash
    shell: "/bin/bash -e"
    run: hostname

  declare_shell_name_bin_sh:
    executor: local.script
    type: script
    description: declare shell name to /bin/sh
    shell: "/bin/sh -e"
    run: hostname


  declare_shell_opts:
    executor: local.sh
    type: script
    description: declare shell name to sh
    shell: "sh -e"
    run: hostname

  declare_shebang:
    executor: local.bash
    type: script
    description: declare shell name to sh
    shebang: "#!/usr/bin/env bash"
    run: hostname

  status_returncode:
    executor: local.bash
    type: script
    description: This test pass because using a valid return code
    run: hostname
    status:
      returncode: 0

  status_regex:
    executor: local.bash
    type: script
    description: This test pass with a regular expression status check
    run: hostname
    status:
      regex:
        stream: stdout
        exp: "^$"

  status_regex_returncode:
    executor: local.bash
    type: script
    description: This test fails because returncode and regex specified
    run: hostname
    status:
      returncode: 0
      regex:
        stream: stdout
        exp: "^hello"

  sbatch_example:
    type: script
    executor: local.bash
    description: This test pass sbatch options in test.
    sbatch:
      - "-t 10:00:00"
      - "-p normal"
      - "-N 1"
      - "-n 8"
    run: hostname

  skip_example:
    type: script
    executor: local.bash
    description: this test is skip
    skip: true
    run: hostname

  tag_example:
    type: script
    executor: local.bash
    description: This is a tag example
    sbatch:
      - "-t 10:00:00"
      - "-p normal"
      - "-N 1"
      - "-n 8"
    tags: [slurm]
    run: hostname




File: /Users/siddiq90/Documents/buildtest/buildtest/schemas/examples/script-v1.0.schema.json/invalid/examples.yml
Valid State: FAIL
________________________________________________________________________________


version: "1.0"
buildspecs:
  invalid_test_name_&!@#$%:
    type: script
    executor: local.bash
    description: "invalid test name"

  invalid_bash:
    type: script
    executor: local.bash
    shell: "bash-missing-run"

  missing_run_key:
    type: script
    executor: local.bash
    description: invalid key name roon, missing run key
    roon: |
        systemctl is-active slurmd
        systemctl is-enabled slurmd | grep enabled

  invalid_env_type:
    type: script
    executor: local.bash
    description: env key should be a dictionary
    env:
      - FOO=BAR
    run: echo $FOO

  invalid_vars_type:
    type: script
    executor: local.bash
    description: var key should be a dictionary
    vars:
      - FOO=BAR
    run: echo $FOO


  invalid_description:
    type: script
    executor: local.bash
    description:
      - "Multi Line description"
      - "is not accepted"

  invalid_regex_stream:
    type: script
    executor: local.bash
    description: This test fails because of invalid regex stream
    run: hostname
    status:
      regex:
        stream: file
        exp: "world$"

  missing_regex_exp:
    type: script
    executor: local.bash
    description: This test fails because of missing key 'exp' in regex
    run: hostname
    status:
      regex:
        stream: stdout

  invalid_returncode_type:
    type: script
    executor: local.bash
    description: This test fails because of invalid return code type
    run: hostname
    status:
      returncode: ["1"]

  invalid_shell_usr_bin_bash:
    type: script
    executor: local.bash
    description: invalid shell name, since we only support 'sh', 'bash', 'python' '/bin/bash' /bin/sh
    shell: /usr/bin/bash
    run: hostname

  invalid_shell_type:
    type: script
    executor: local.bash
    description: invalid shell type must be a string
    shell: ["/bin/bash"]
    run: hostname

  invalid_type_shell_shebang:
    type: script
    executor: local.bash
    description: invalid type for shell shebang, must be a string
    shebang: ["#!/bin/bash"]
    run: hostname

  invalid_skip_value:
    type: script
    executor: local.bash
    description: invalid value for skip, must be boolean
    skip: 1
    run: hostname

  invalid_tags_value:
    type: script
    executor: local.bash
    description: invalid tag value must be all string items
    tags: ["compiler", 400 ]
    run: hostname

  additionalProperties_test:
    type: script
    executor: local.bash
    description: additional properties are not allowed so any invalid key/value pair will result in error
    FOO: BAR
    run: hostname

________________________________________ Validation Error ________________________________________
'invalid_test_name_&!@#$%' does not match '^[A-Za-z_][A-Za-z0-9_]*$'

Failed validating 'pattern' in schema['properties']['buildspecs']['propertyNames']:
    {'pattern': '^[A-Za-z_][A-Za-z0-9_]*$'}

On instance['buildspecs']:
    'invalid_test_name_&!@#$%'