spack.ci package

class spack.ci.PushResult(success, url)

Bases: tuple

success

Alias for field number 0

url

Alias for field number 1

class spack.ci.RebuildDecision(rebuild: bool = True, reason: str = '')[source]

Bases: object

spack.ci.can_sign_binaries()[source]

Utility method to determine if this spack instance is capable of signing binary packages. This is currently only possible if the spack gpg keystore contains exactly one secret key.

spack.ci.can_verify_binaries()[source]

Utility method to determine if this spack instance is capable (at least in theory) of verifying signed binaries.

spack.ci.check_for_broken_specs(pipeline_specs: List[Spec], broken_specs_url: str) bool[source]

Check the pipeline specs against the list of known broken specs and return True if there were any matches, False otherwise.

spack.ci.collect_pipeline_options(env: Environment, args) PipelineOptions[source]

Gather pipeline options from command-line arguments, the spack environment, and OS environment variables

spack.ci.compute_affected_packages(rev1='HEAD^', rev2='HEAD')[source]

Determine which packages were added, removed or changed between rev1 and rev2, and return the names as a set
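
A minimal usage sketch, assuming spack is running from a git checkout; the default revisions are shown explicitly for clarity:

    import spack.ci

    # Names of packages touched by the most recent commit
    changed = spack.ci.compute_affected_packages(rev1="HEAD^", rev2="HEAD")
    print(sorted(changed))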

spack.ci.copy_stage_logs_to_artifacts(job_spec: Spec, job_log_dir: str) None[source]

Copy selected build stage file(s) to the given artifacts directory

Looks for build logs in the stage directory of the given job_spec, and attempts to copy the files into the directory given by job_log_dir.

Parameters:
  • job_spec – spec associated with spack install log

  • job_log_dir – path into which build log should be copied

spack.ci.copy_test_logs_to_artifacts(test_stage, job_test_dir)[source]

Copy test log file(s) to the given artifacts directory

Parameters:
  • test_stage (str) – test stage path

  • job_test_dir (str) – the destination artifacts test directory

spack.ci.create_already_built_pruner(check_index_only: bool = True) Callable[[Spec], RebuildDecision][source]

Return a filter that prunes specs already present on any configured mirrors

spack.ci.create_buildcache(input_spec: Spec, *, destination_mirror_urls: List[str], sign_binaries: bool = False) List[PushResult][source]

Create the buildcache at the provided mirror(s).

Parameters:
  • input_spec – Installed spec to package and push

  • destination_mirror_urls – List of urls to push to

  • sign_binaries – Whether or not to sign buildcache entry

Returns: A list of PushResults, indicating success or failure.
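
A minimal sketch of inspecting the returned PushResult list; the mirror URL is hypothetical and installed_spec is assumed to be an already installed Spec:

    import spack.ci

    results = spack.ci.create_buildcache(
        installed_spec,  # assumed: an installed spack.spec.Spec
        destination_mirror_urls=["s3://example-bucket/mirror"],  # hypothetical mirror
        sign_binaries=spack.ci.can_sign_binaries(),
    )
    failed = [r.url for r in results if not r.success]
    if failed:
        print("buildcache push failed for:", ", ".join(failed))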

spack.ci.create_external_pruner() Callable[[Spec], RebuildDecision][source]

Return a filter that prunes external specs

spack.ci.create_unaffected_pruner(affected_specs: Set[Spec]) Callable[[Spec], RebuildDecision][source]

Given a set of “affected” specs, return a filter that prunes specs not in the set.

spack.ci.display_broken_spec_messages(base_url, hashes)[source]

Fetch the broken spec file for each of the hashes under the base_url and print a message with some details about each one.

spack.ci.download_and_extract_artifacts(url, work_dir)[source]

Look for gitlab artifacts.zip at the given url, and attempt to download and extract the contents into the given work_dir.

Parameters:
  • url (str) – Complete url to artifacts.zip file

  • work_dir (str) – Path to destination where artifacts should be extracted

spack.ci.generate_pipeline(env: Environment, args) None[source]

Given an environment and the command-line args, generate a pipeline.

Parameters:
  • env (spack.environment.Environment) – Activated environment object which must contain a ci section describing attributes for all jobs and a target which should specify an existing pipeline generator.

  • args (spack.main.SpackArgumentParser) – Parsed arguments from the command line.

spack.ci.get_change_revisions()[source]

If this is a git repo, get the revisions to use when checking for changed packages and spack core modules.

spack.ci.get_spack_info()[source]

If spack is running from a git repo, return the most recent git log entry, otherwise, return a string containing the spack version.

spack.ci.get_spec_filter_list(env, affected_pkgs, dependent_traverse_depth=None)[source]

Given a list of package names and an active/concretized environment, return the set of all concrete specs from the environment that could have been affected by changing the list of packages.

If a dependent_traverse_depth is given, it is used to limit upward (in the parent direction) traversal of specs of touched packages. E.g. if 1 is provided, then only direct dependents of touched package specs are traversed to produce specs that could have been affected by changing the package, while if 0 is provided, only the changed specs themselves are traversed. If None is given, upward traversal of touched package specs is done all the way to the environment roots. Providing a negative number results in no traversals at all, yielding an empty set.

Parameters:
  • env (spack.environment.Environment) – Active concrete environment

  • affected_pkgs (List[str]) – Affected package names

  • dependent_traverse_depth – Optional integer to limit dependent traversal, or None to disable the limit.

Returns:

A set of concrete specs from the active environment including those associated with affected packages, their dependencies and dependents, as well as their dependents' dependencies.
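
A minimal sketch, assuming an activated and concretized environment, that limits upward traversal to direct dependents of the touched packages:

    import spack.ci
    import spack.environment as ev

    env = ev.active_environment()  # assumes `spack env activate` has been run
    touched = spack.ci.compute_affected_packages("HEAD^", "HEAD")
    affected = spack.ci.get_spec_filter_list(env, touched, dependent_traverse_depth=1)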

spack.ci.get_stack_changed(env_path, rev1='HEAD^', rev2='HEAD')[source]

Given an environment manifest path and two revisions to compare, return whether or not the stack was changed. Returns True if the environment manifest changed between the provided revisions (or additionally if the .gitlab-ci.yml file itself changed). Returns False otherwise.

spack.ci.import_signing_key(base64_signing_key)[source]

Given a Base64-encoded gpg key, decode and import it to use for signing packages.

Parameters:
  • base64_signing_key (str) – A gpg key including the secret key, armor-exported and base64 encoded, so it can be stored in a gitlab CI variable. For an example of how to generate such a key, see: https://github.com/spack/spack-infrastructure/blob/main/gitlab-docker/files/gen-key
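
A minimal sketch; the CI variable name SPACK_SIGNING_KEY is a hypothetical example, not a variable this module defines:

    import os
    import spack.ci

    key_b64 = os.environ.get("SPACK_SIGNING_KEY")  # hypothetical CI variable name
    if key_b64:
        spack.ci.import_signing_key(key_b64)
        assert spack.ci.can_sign_binaries()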

spack.ci.process_command(name, commands, repro_dir, run=True, exit_on_failure=True)[source]

Create a script for the command, copy the script to the reproducibility directory, and run it.

Parameters:
  • name (str) – name of the command being processed

  • commands (list) – list of arguments for single command or list of lists of arguments for multiple commands. No shell escape is performed.

  • repro_dir (str) – Job reproducibility directory

  • run (bool) – Run the script and return the exit code if True

Returns: the exit code from processing the command
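
A minimal sketch; the command and reproducibility path are illustrative only:

    import spack.ci

    exit_code = spack.ci.process_command(
        "rebuild",
        [["spack", "-e", ".", "install", "--keep-stage"]],  # list of argument lists
        repro_dir="jobs_scratch_dir/repro",                  # illustrative path
        run=True,
        exit_on_failure=False,  # return the exit code instead of exiting
    )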

spack.ci.prune_pipeline(pipeline: PipelineDag, pruning_filters: List[Callable[[Spec], RebuildDecision]], print_summary: bool = False) None[source]

Given a PipelineDag and a list of pruning filters, prune the pipeline in place so that it contains only the nodes that survive pruning by all of the filters.
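
A minimal sketch combining the pruning filters documented above; concrete_specs is assumed to be a list of concrete Specs and affected a set produced by get_spec_filter_list:

    import spack.ci
    from spack.ci.common import PipelineDag

    pipeline = PipelineDag(concrete_specs)  # assumed: List[Spec] from the environment
    filters = [
        spack.ci.create_already_built_pruner(check_index_only=True),
        spack.ci.create_external_pruner(),
        spack.ci.create_unaffected_pruner(affected),  # assumed: Set[Spec]
    ]
    spack.ci.prune_pipeline(pipeline, filters, print_summary=True)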

spack.ci.push_to_build_cache(spec: Spec, mirror_url: str, sign_binaries: bool) bool[source]

Push one or more binary packages to the mirror.

Parameters:
  • spec – Installed spec to push

  • mirror_url – URL of target mirror

  • sign_binaries – If True, spack will attempt to sign binary package before pushing.

spack.ci.read_broken_spec(broken_spec_url)[source]

Read data from broken specs file located at the url, return as a yaml object.

spack.ci.reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime)[source]

Given a url to gitlab artifacts.zip from a failed ‘spack ci rebuild’ job, attempt to set up an environment in which the failure can be reproduced locally. This entails the following:

First download and extract the artifacts. Then look through those artifacts to glean some information needed for the reproducer (e.g. one of the artifacts contains information about the version of spack tested by gitlab, another is the generated pipeline yaml containing details of the job such as the docker image used to run it). The output of this function is a set of printed instructions for running docker, followed by the commands to run to reproduce the build once inside the container.
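
A minimal sketch; the artifacts URL, work directory, and runtime value are hypothetical placeholders:

    import spack.ci

    spack.ci.reproduce_ci_job(
        url="https://gitlab.example.com/jobs/12345/artifacts.zip",  # hypothetical URL
        work_dir="/tmp/repro",
        autostart=False,
        gpg_url=None,
        runtime="docker",  # assumed runtime name
    )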

spack.ci.run_standalone_tests(**kwargs)[source]

Run stand-alone tests on the current spec.

Parameters:

kwargs (dict) – dictionary of arguments used to run the tests

List of recognized keys:

  • “cdash” (CDashHandler): (optional) cdash handler instance

  • “fail_fast” (bool): (optional) terminate tests after the first failure

  • “log_file” (str): (optional) test log file name when not reporting to CDash

  • “job_spec” (Spec): spec that was built

  • “repro_dir” (str): reproduction directory
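
A minimal sketch using the recognized keys; built_spec is assumed to be the Spec that was just installed, and the paths are illustrative:

    import spack.ci

    spack.ci.run_standalone_tests(
        job_spec=built_spec,                           # assumed: the Spec that was built
        repro_dir="jobs_scratch_dir/repro",            # illustrative path
        log_file="jobs_scratch_dir/test/results.txt",  # non-CDash reporting
        fail_fast=True,
    )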

spack.ci.setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None)[source]

Look in the local spack clone to find the checkout_commit, and if provided, the merge_commit given as arguments. If those commits can be found locally, then clone spack and attempt to recreate a merge commit with the same parent commits as tested in gitlab. This looks something like: 1) git clone repo && cd repo, 2) git checkout <checkout_commit>, 3) git merge <merge_commit>. If there is no merge_commit provided, then skip step (3).

Parameters:
  • repro_dir (str) – Location where spack should be cloned

  • checkout_commit (str) – SHA of PR branch commit

  • merge_commit (str) – SHA of target branch parent

Returns: True if git repo state was successfully recreated, or False otherwise.

spack.ci.write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict)[source]

Given a url to write to and the details of the failed job, write an entry in the broken specs list.

Submodules

spack.ci.common module

class spack.ci.common.CDashHandler(ci_cdash)[source]

Bases: object

Class for managing CDash data and processing.

args()[source]
build_name(spec: Spec | None = None) str | None[source]

Returns the CDash build name.

A name will be generated if the spec is provided; otherwise, the value will be retrieved from the environment through the SPACK_CDASH_BUILD_NAME variable.

Returns: (str) given spec’s CDash build name.

property build_stamp

Returns the CDash build stamp.

The one defined by the SPACK_CDASH_BUILD_STAMP environment variable is preferred due to the representation of timestamps; otherwise, one will be built.

Returns: (str) current CDash build stamp

copy_test_results(source, dest)[source]

Copy test results to artifacts directory.

create_buildgroup(opener, headers, url, group_name, group_type)[source]
populate_buildgroup(job_names)[source]
property project_enc
report_skipped(spec: Spec, report_dir: str, reason: str | None)[source]

Explicitly report skipping testing of a spec (e.g., its CI configuration identifies it as known to have broken tests or the CI installation failed).

Parameters:
  • spec – spec being tested

  • report_dir – directory where the report will be written

  • reason – reason the test is being skipped

property upload_url
class spack.ci.common.PipelineDag(specs: List[Spec])[source]

Bases: object

Turn a list of specs into a simple directed graph that doesn’t keep track of edge types.

get_dependencies(node: PipelineNode) List[PipelineNode][source]

Returns a list of nodes corresponding to the direct dependencies of the given node.

classmethod key(spec: Spec) str[source]
prune(node_key: str)[source]

Remove a node from the graph, and reconnect its parents and children

traverse_nodes(direction: str = 'children') Generator[Tuple[int, PipelineNode], None, None][source]

Yields (depth, node) from the pipeline graph. Traversal is topologically ordered from the roots if direction is children, or from the leaves if direction is parents. The yielded depth is the length of the longest path from the starting point to the yielded node.
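
A minimal sketch, assuming concrete_specs is a list of concrete Specs, that walks the graph topologically from the roots:

    from spack.ci.common import PipelineDag

    dag = PipelineDag(concrete_specs)  # assumed: List[Spec]
    for depth, node in dag.traverse_nodes(direction="children"):
        print(depth, node.spec.name)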

class spack.ci.common.PipelineNode(spec: Spec)[source]

Bases: object

children: Set[str]
property key

Return key of the stored spec

parents: Set[str]
spec: Spec
class spack.ci.common.PipelineOptions(env: Environment, buildcache_destination: Mirror, artifacts_root: str = 'jobs_scratch_dir', print_summary: bool = True, output_file: str | None = None, check_index_only: bool = False, broken_specs_url: str | None = None, rebuild_index: bool = True, untouched_pruning_dependent_depth: int | None = None, prune_untouched: bool = False, prune_up_to_date: bool = True, prune_external: bool = True, stack_name: str | None = None, pipeline_type: PipelineType | None = None, require_signing: bool = False, cdash_handler: CDashHandler | None = None)[source]

Bases: object

A container for all pipeline options that can be specified (whether via cli, config/yaml, or environment variables)

class spack.ci.common.PipelineType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

COPY_ONLY = 1
PROTECTED_BRANCH = 2
PULL_REQUEST = 3
spack_copy_only = 1
spack_protected_branch = 2
spack_pull_request = 3
class spack.ci.common.SpackCIConfig(ci_config)[source]

Bases: object

Spack CI object used to generate intermediate representation used by the CI generator(s).

generate_ir()[source]

Generate the IR from the Spack CI configurations.

init_pipeline_jobs(pipeline: PipelineDag)[source]
exception spack.ci.common.SpackCIError(msg)[source]

Bases: SpackError

spack.ci.common.copy_files_to_artifacts(src, artifacts_dir)[source]

Copy file(s) to the given artifacts directory

Parameters:
  • src (str) – the glob-friendly path expression for the file(s) to copy

  • artifacts_dir (str) – the destination directory
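
A minimal sketch; the glob expression and destination are illustrative:

    from spack.ci.common import copy_files_to_artifacts

    copy_files_to_artifacts(
        "/tmp/spack-stage/*/spack-build-out.txt",  # illustrative glob
        "jobs_scratch_dir/logs",                   # illustrative destination
    )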

spack.ci.common.ensure_expected_target_path(path: str) str[source]

Returns the passed path with all Windows path separators exchanged for posix separators

TODO (johnwparent): Refactor config + cli read/write to deal only in posix style paths

spack.ci.common.unpack_script(script_section, op=<function _noop>)[source]
spack.ci.common.update_env_scopes(env: Environment, cli_scopes: List[str], output_file: str, transform_windows_paths: bool = False) None[source]

Add any config scopes from cli_scopes which aren’t already included in the environment, by reading the yaml, adding the missing includes, and writing the updated yaml back to the same location.

spack.ci.common.win_quote(quote_str: str) str[source]
spack.ci.common.write_pipeline_manifest(specs, src_prefix, dest_prefix, output_file)[source]

Write out the file describing specs that should be copied

spack.ci.generator_registry module

Generators that support writing out pipelines for various CI platforms, using a common pipeline graph definition.

exception spack.ci.generator_registry.UnknownGeneratorException(generator_name)[source]

Bases: SpackError

spack.ci.generator_registry.generator(name)[source]

Decorator to register a pipeline generator method. A generator method should take PipelineDag, SpackCIConfig, and PipelineOptions arguments, and should produce a pipeline file.
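
A minimal sketch of registering a custom generator; the generator name and the (omitted) writing logic are hypothetical:

    from spack.ci.generator_registry import generator

    @generator("example-ci")  # hypothetical generator name
    def generate_example_yaml(pipeline, spack_ci, options):
        """Walk the pruned pipeline graph and write a pipeline file to options.output_file."""
        for _depth, node in pipeline.traverse_nodes(direction="children"):
            ...  # emit a job for node.spec (writing logic omitted in this sketch)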

spack.ci.generator_registry.get_generator(name)[source]

spack.ci.gitlab module

spack.ci.gitlab.generate_gitlab_yaml(pipeline: PipelineDag, spack_ci: SpackCIConfig, options: PipelineOptions)[source]

Given a pipeline graph, job attributes, and pipeline options, write a pipeline that can be consumed by GitLab to the given output file.

Parameters:
  • pipeline – An already pruned graph of jobs representing all the specs to build

  • spack_ci – An object containing the configured attributes of all jobs in the pipeline

  • options – An object containing all the pipeline options gathered from yaml, env, etc…

spack.ci.gitlab.get_job_name(spec: Spec, build_group: str | None = None) str[source]

Given a spec and possibly a build group, return the job name. If the resulting name is longer than 255 characters, it will be truncated.

Parameters:
  • spec – Spec job will build

  • build_group – Name of build group this job belongs to (a CDash notion)

Returns: The job name

spack.ci.gitlab.maybe_generate_manifest(pipeline: PipelineDag, options: PipelineOptions, manifest_path)[source]