spack package

spack.spack_version_info = (0, 22, 0, 'dev0')

(major, minor, micro, dev release) tuple

Subpackages

Submodules

spack.abi module

class spack.abi.ABI[source]

Bases: object

This class provides methods to test ABI compatibility between specs. The current implementation is rather rough and could be improved.

architecture_compatible(target: Spec, constraint: Spec) bool[source]

Return true if the architecture of the target spec is ABI compatible with the architecture of the constraint spec. If either the target or constraint spec has no architecture, the target is also considered architecture ABI compatible with the constraint.

compatible(target: Spec, constraint: Spec, loose: bool = False) bool[source]

Returns true if the target spec is ABI compatible with the constraint spec

compiler_compatible(parent: Spec, child: Spec, loose: bool = False) bool[source]

Return true if compilers for parent and child are ABI compatible.
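For illustration, a minimal usage sketch of these methods, assuming a fully configured Spack instance; the spec strings below are placeholders:

import spack.abi
import spack.spec

# Hypothetical concrete specs; any pair of concrete specs could be used here.
target = spack.spec.Spec("zlib %gcc@12.3.0").concretized()
constraint = spack.spec.Spec("zlib %gcc@12").concretized()

abi = spack.abi.ABI()
if abi.compatible(target, constraint, loose=True):
    print("ABI compatible")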

spack.audit module

Classes and functions to register audit checks for various parts of Spack and run them on-demand.

To register a new class of sanity checks (e.g. sanity checks for compilers.yaml), the first action required is to create a new AuditClass object:

audit_cfgcmp = AuditClass(
    tag='CFG-COMPILER',
    description='Sanity checks on compilers.yaml',
    kwargs=()
)

This object is to be used as a decorator to register functions that will each perform a single check:

@audit_cfgcmp
def _search_duplicate_compilers(error_cls):
    pass

These functions need to take as arguments the keywords declared when creating the decorator object, plus an error_cls argument at the end that acts as a factory for Error objects. Each function should return a (possibly empty) list of errors.

Calls to each of these functions are triggered by the run method of the decorator object, which forwards the keyword arguments passed as input.
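Putting the pieces together, a hedged sketch of a complete check registered on the decorator object above; the duplicate-detection logic and the compiler names are purely illustrative:

@audit_cfgcmp
def _check_duplicates(error_cls):
    errors = []
    seen = set()
    # Stand-in for real data read from compilers.yaml
    for name in ("gcc@12.3.0", "clang@16.0.0", "gcc@12.3.0"):
        if name in seen:
            errors.append(error_cls(summary=f"duplicate compiler {name}", details=[]))
        seen.add(name)
    return errors

# Trigger every check registered under this tag; keyword arguments, if any,
# are forwarded to each check.
errors = audit_cfgcmp.run()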

class spack.audit.AuditClass(group, tag, description, kwargs)[source]

Bases: Sequence

run(**kwargs)[source]
spack.audit.CALLBACKS = {'CFG-COMPILER': <spack.audit.AuditClass object>, 'CFG-PACKAGES': <spack.audit.AuditClass object>, 'GENERIC': <spack.audit.AuditClass object>, 'PKG-ATTRIBUTES': <spack.audit.AuditClass object>, 'PKG-DIRECTIVES': <spack.audit.AuditClass object>, 'PKG-EXTERNALS': <spack.audit.AuditClass object>, 'PKG-HTTPS-DIRECTIVES': <spack.audit.AuditClass object>, 'PKG-PROPERTIES': <spack.audit.AuditClass object>}

Map an audit tag to a list of callables implementing checks

class spack.audit.Error(summary, details)[source]

Bases: object

Information on an error reported in a test.

spack.audit.GROUPS = {'configs': ['CFG-COMPILER', 'CFG-PACKAGES'], 'externals': ['PKG-EXTERNALS'], 'generic': ['GENERIC'], 'packages': ['PKG-DIRECTIVES', 'PKG-ATTRIBUTES', 'PKG-PROPERTIES'], 'packages-https': ['PKG-HTTPS-DIRECTIVES']}

Map a group of checks to the list of related audit tags

spack.audit.config_compiler = <spack.audit.AuditClass object>

Sanity checks on compilers.yaml

spack.audit.config_packages = <spack.audit.AuditClass object>

Sanity checks on packages.yaml

spack.audit.external_detection = <spack.audit.AuditClass object>

Sanity checks on detection of external packages

spack.audit.generic = <spack.audit.AuditClass object>

Generic checks relying on global state

spack.audit.package_directives = <spack.audit.AuditClass object>

Sanity checks on package directives

spack.audit.packages_with_detection_tests()[source]

Return the list of packages with a corresponding detection_test.yaml file.

spack.audit.run_check(tag, **kwargs)[source]

Run the checks associated with a single tag.

Parameters:
  • tag (str) – tag of the check

  • **kwargs – keyword arguments forwarded to the checks

Returns:

Errors occurred during the checks

spack.audit.run_group(group, **kwargs)[source]

Run the checks that are part of the group passed as argument.

Parameters:
  • group (str) – group of checks to be run

  • **kwargs – keyword arguments forwarded to the checks

Returns:

List of (tag, errors) that failed.
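A short consumption sketch, assuming each Error exposes the summary it was constructed with:

import spack.audit

# Run all checks in the "configs" group and report what failed.
for tag, errors in spack.audit.run_group("configs"):
    print(f"{tag}: {len(errors)} error(s)")
    for error in errors:
        print(f"  {error.summary}")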

spack.binary_distribution module

spack.binary_distribution.BINARY_INDEX: BinaryCacheIndex = <spack.binary_distribution.BinaryCacheIndex object>

Default binary cache index instance

class spack.binary_distribution.BinaryCacheIndex(cache_root: str | None = None)[source]

Bases: object

The BinaryCacheIndex tracks what specs are available on (usually remote) binary caches.

This index is “best effort”, in the sense that whenever we don’t find what we’re looking for here, we will attempt to fetch it directly from configured mirrors anyway. Thus, it has the potential to speed things up, but cache misses shouldn’t break any spack functionality.

At the moment, everything in this class is initialized as lazily as possible, so that it avoids slowing anything in spack down until absolutely necessary.

TODO: What’s the cost if, e.g., we realize in the middle of a spack install that the cache is out of date, and we fetch directly? Does it mean we should have paid the price to update the cache earlier?

clear()[source]

For testing purposes we need to be able to empty the cache and clear associated data structures.

find_built_spec(spec, mirrors_to_check=None)[source]

Look in our cache for the built spec corresponding to spec.

If the spec can be found among the configured binary mirrors, a list is returned that contains the concrete spec and the mirror url of each mirror where it can be found. Otherwise, None is returned.

This method does not trigger reading anything from remote mirrors, but rather just checks if the concrete spec is found within the cache.

The cache can be updated by calling update() on the cache.

Parameters:
  • spec (spack.spec.Spec) – Concrete spec to find

  • mirrors_to_check – Optional mapping containing mirrors to check. If None, just assumes all configured mirrors.

Returns:

A list of objects containing the found specs and the mirror url where each can be found, e.g.:

[
    {
        "spec": <concrete-spec>,
        "mirror_url": <mirror-root-url>
    }
]
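A hedged usage sketch against the default index; the spec string is a placeholder:

import spack.binary_distribution as bindist
import spack.spec

spec = spack.spec.Spec("zlib").concretized()

# Refresh the local copies of the buildcache indices, then query the cache.
bindist.BINARY_INDEX.update()
results = bindist.BINARY_INDEX.find_built_spec(spec)
if results:
    for entry in results:
        print(entry["mirror_url"], entry["spec"])
else:
    print("spec not found in any configured binary mirror")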

find_by_hash(find_hash, mirrors_to_check=None)[source]

Same as find_built_spec but uses the hash of a spec.

Parameters:
  • find_hash (str) – hash of the spec to search

  • mirrors_to_check – Optional mapping containing mirrors to check. If None, just assumes all configured mirrors.

get_all_built_specs()[source]
regenerate_spec_cache(clear_existing=False)[source]

Populate the local cache of concrete specs (_mirrors_for_spec) from the locally cached buildcache index files. This is essentially a no-op if it has already been done, as we keep track of the index hashes for which we have already associated the built specs.

update(with_cooldown=False)[source]

Make sure local cache of buildcache index files is up to date. If the same mirrors are configured as the last time this was called and none of the remote buildcache indices have changed, calling this method will only result in fetching the index hash from each mirror to confirm it is the same as what is stored locally. Otherwise, the buildcache index.json and index.json.hash files are retrieved from each configured mirror and stored locally (both in memory and on disk under _index_cache_root).

update_spec(spec, found_list)[source]

Take a list of {‘mirror_url’: m, ‘spec’: s} objects and update the local built_spec_cache

class spack.binary_distribution.BinaryCacheQuery(all_architectures)[source]

Bases: object

Callable object to query if a spec is in a binary cache

class spack.binary_distribution.BuildCacheDatabase(root)[source]

Bases: Database

A database for binary buildcaches.

A database supports writing buildcache index files, in which case certain fields are not needed in each install record and no locking is required. To use this feature, pass lock_cfg=NO_LOCK and override the list of record_fields.

record_fields: Tuple[str, ...] = ('spec', 'ref_count', 'in_buildcache')

Fields written for each install record

class spack.binary_distribution.BuildManifestVisitor[source]

Bases: BaseDirectoryVisitor

Visitor that collects a list of files and symlinks that can be checked for need of relocation. It knows how to dedupe hardlinks and deal with symlinks to files and directories.

before_visit_dir(root, rel_path, depth)[source]

Return True from this function to recurse into the directory at os.path.join(root, rel_path). Return False in order not to recurse further.

Parameters:
  • root – root directory

  • rel_path – relative path to current directory from root

  • depth – depth of current directory from the root directory

Returns:

True when the directory should be recursed into. False when not

Return type:

bool

before_visit_symlinked_dir(root, rel_path, depth)[source]

Return True to recurse into the symlinked directory and False in order not to. Note: rel_path is the path to the symlink itself. Following symlinked directories blindly can cause infinite recursion due to cycles.

Parameters:
  • root – root directory

  • rel_path – relative path to current symlink from root

  • depth – depth of current symlink from the root directory

Returns:

True when the directory should be recursed into. False when not

Return type:

bool

seen_before(root, rel_path)[source]
visit_file(root, rel_path, depth)[source]

Handle the non-symlink file at os.path.join(root, rel_path)

Parameters:
  • root – root directory

  • rel_path – relative path to current file from root

  • depth (int) – depth of current file from the root directory

visit_symlinked_file(root, rel_path, depth)[source]

Handle the symlink to a file at os.path.join(root, rel_path). Note: rel_path is the location of the symlink, not to what it is pointing to. The symlink may be dangling.

Parameters:
  • root – root directory

  • rel_path – relative path to current symlink from root

  • depth – depth of current symlink from the root directory

exception spack.binary_distribution.BuildcacheIndexError(message, long_message=None)[source]

Bases: SpackError

Raised when a buildcache cannot be read for any reason

spack.binary_distribution.CURRENT_BUILD_CACHE_LAYOUT_VERSION = 2

The build cache layout version that this version of Spack creates. Version 2: includes parent directories of the package prefix in the tarball

class spack.binary_distribution.DefaultIndexFetcher(url, local_hash, urlopen=<function _urlopen.<locals>.dispatch_open>)[source]

Bases: object

Fetcher for index.json, using separate index.json.hash as cache invalidation strategy

conditional_fetch() FetchIndexResult[source]
get_remote_hash()[source]
class spack.binary_distribution.EtagIndexFetcher(url, etag, urlopen=<function _urlopen.<locals>.dispatch_open>)[source]

Bases: object

Fetcher for index.json, using ETags headers as cache invalidation strategy

conditional_fetch() FetchIndexResult[source]
exception spack.binary_distribution.FetchCacheError(errors)[source]

Bases: Exception

Error thrown when fetching the cache failed, usually a composite error list.

exception spack.binary_distribution.FetchIndexError[source]

Bases: Exception

class spack.binary_distribution.FetchIndexResult(etag, hash, data, fresh)

Bases: tuple

data

Alias for field number 2

etag

Alias for field number 0

fresh

Alias for field number 3

hash

Alias for field number 1

exception spack.binary_distribution.InvalidMetadataFile(message, long_message=None)[source]

Bases: SpackError

exception spack.binary_distribution.ListMirrorSpecsError(message, long_message=None)[source]

Bases: SpackError

Raised when unable to retrieve list of specs from the mirror

exception spack.binary_distribution.NewLayoutException(msg)[source]

Bases: SpackError

Raised if directory layout is different from buildcache.

exception spack.binary_distribution.NoChecksumException(path, size, contents, algorithm, expected, computed)[source]

Bases: SpackError

Raised if file fails checksum verification.

exception spack.binary_distribution.NoGpgException(msg)[source]

Bases: SpackError

Raised when gpg2 is not in PATH

exception spack.binary_distribution.NoKeyException(msg)[source]

Bases: SpackError

Raised when gpg has no default key added.

exception spack.binary_distribution.NoOverwriteException(file_path)[source]

Bases: SpackError

Raised when a file would be overwritten

exception spack.binary_distribution.NoVerifyException(message, long_message=None)[source]

Bases: SpackError

Raised if file fails signature verification.

exception spack.binary_distribution.NotInstalledError(specs: List[Spec])[source]

Bases: SpackError

Raised when a spec is not installed but picked to be packaged.

class spack.binary_distribution.OCIIndexFetcher(url: str, local_hash, urlopen=None)[source]

Bases: object

conditional_fetch() FetchIndexResult[source]

Download an index from an OCI registry type mirror.

exception spack.binary_distribution.PickKeyException(keys)[source]

Bases: SpackError

Raised when multiple keys can be used to sign.

class spack.binary_distribution.PushOptions(force, regenerate_index, unsigned, key)[source]

Bases: NamedTuple

force: bool

Overwrite existing tarball/metadata files in buildcache

key: str | None

What key to use for signing

regenerate_index: bool

Regenerate indices after pushing

unsigned: bool

If True, push the package without signing.
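A hedged sketch of building the options and pushing a single installed spec; the mirror url and the spec variable are placeholders:

import spack.binary_distribution as bindist

options = bindist.PushOptions(
    force=False,            # do not overwrite existing tarball/metadata files
    regenerate_index=True,  # regenerate the buildcache index after pushing
    unsigned=True,          # push without signing
    key=None,               # no signing key needed when unsigned
)
pushed = bindist.push(spec, "file:///tmp/my-mirror", options)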

exception spack.binary_distribution.UnsignedPackageException(message, long_message=None)[source]

Bases: SpackError

Raised if installation of unsigned package is attempted without the use of --no-check-signature.

spack.binary_distribution.binary_index_location()[source]

Set up a BinaryCacheIndex for remote buildcache dbs in the user’s homedir.

spack.binary_distribution.build_cache_keys_relative_path()[source]
spack.binary_distribution.build_cache_prefix(prefix)[source]
spack.binary_distribution.build_cache_relative_path()[source]
spack.binary_distribution.buildinfo_file_name(prefix)[source]

Filename of the binary package meta-data file

spack.binary_distribution.check_specs_against_mirrors(mirrors, specs, output_file=None)[source]

Check all the given specs against buildcaches on the given mirrors and determine if any of the specs need to be rebuilt. Specs need to be rebuilt when their hash doesn’t exist in the mirror.

Parameters:
  • mirrors (dict) – Mirrors to check against

  • specs (Iterable) – Specs to check against mirrors

  • output_file (str) – Path to output file to be written. If provided, mirrors with missing or out-of-date specs will be formatted as a JSON object and written to this file.

Returns: 1 if any spec was out-of-date on any mirror, 0 otherwise.

spack.binary_distribution.clear_spec_cache()[source]
spack.binary_distribution.compute_hash(data)[source]

Updates a buildinfo dict for old archives that did not dedupe hardlinks. De-duping hardlinks is necessary when relocating files in parallel and in-place. This means we must preserve inodes when relocating.

spack.binary_distribution.download_buildcache_entry(file_descriptions, mirror_url=None)[source]
spack.binary_distribution.download_single_spec(concrete_spec, destination, mirror_url=None)[source]

Download the buildcache files for a single concrete spec.

Parameters:
  • concrete_spec – concrete spec to be downloaded

  • destination (str) – path where to put the downloaded buildcache

  • mirror_url (str) – url of the mirror from which to download

spack.binary_distribution.download_tarball(spec, unsigned: bool | None = False, mirrors_for_spec=None)[source]

Download binary tarball for given package into stage area, returning path to downloaded tarball if successful, None otherwise.

Parameters:
  • spec (spack.spec.Spec) – Concrete spec

  • unsigned – if True or False override the mirror signature verification defaults

  • mirrors_for_spec (list) – Optional list of concrete specs and mirrors obtained by calling binary_distribution.get_mirrors_for_spec(). These will be checked in order first before looking in other configured mirrors.

Returns:

None if the tarball could not be downloaded (maybe also verified, depending on whether new-style signed binary packages were found). Otherwise, return an object indicating the path to the downloaded tarball, the path to the downloaded specfile (in the case of new-style buildcache), and whether or not the tarball is already verified.

{
    "tarball_path": "path-to-locally-saved-tarfile",
    "specfile_path": "none-or-path-to-locally-saved-specfile",
    "signature_verified": "true-if-binary-pkg-was-already-verified"
}
spack.binary_distribution.extract_tarball(spec, download_result, force=False, timer=<spack.util.timer.NullTimer object>)[source]

Extract binary tarball for given package into install area.

spack.binary_distribution.file_matches(path, regex)[source]
spack.binary_distribution.generate_key_index(key_prefix, tmpdir=None)[source]

Create the key index page.

Creates (or replaces) the “index.json” page at the location given in key_prefix. This page contains an entry for each key (.pub) under key_prefix.

spack.binary_distribution.generate_package_index(cache_prefix, concurrency=32)[source]

Create or replace the build cache index on the given mirror. The buildcache index contains an entry for each binary package under the cache_prefix.

Parameters:
  • cache_prefix (str) – Base url of binary mirror.

  • concurrency (int) – The desired threading concurrency to use when fetching the spec files from the mirror.

Returns:

None

spack.binary_distribution.get_buildfile_manifest(spec)[source]

Return a data structure with information about a build, including text_to_relocate, binary_to_relocate, binary_to_relocate_fullpath, link_to_relocate, and other (files that don’t fit any of the previous categories and should not be relocated). We exclude docs (man) and metadata (.spack). This can be used to find a particular kind of file in spack, or to generate the build metadata.

spack.binary_distribution.get_buildinfo_dict(spec)[source]

Create metadata for a tarball

spack.binary_distribution.get_keys(install=False, trust=False, force=False, mirrors=None)[source]

Get pgp public keys available on mirror with suffix .pub

spack.binary_distribution.get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False)[source]

Check if concrete spec exists on mirrors and return a list indicating the mirrors on which it can be found

Parameters:
  • spec (spack.spec.Spec) – The spec to look for in binary mirrors

  • mirrors_to_check (dict) – Optionally override the configured mirrors with the mirrors in this dictionary.

  • index_only (bool) – When index_only is set to True, only the local cache is checked, no requests are made.

Returns:

A list of objects, each containing a mirror_url and spec key, indicating all mirrors where the spec can be found.

spack.binary_distribution.hashes_to_prefixes(spec)[source]

Return a dictionary of hashes to prefixes for a spec and its deps, excluding externals

spack.binary_distribution.install_root_node(spec, unsigned=False, force=False, sha256=None)[source]

Install the root node of a concrete spec from a buildcache.

Checking the sha256 sum of a node before installation is usually needed only for software installed during Spack’s bootstrapping (since we might not have a proper signature verification mechanism available).

Parameters:
  • spec – spec to be installed (note that only the root node will be installed)

  • unsigned (bool) – if True allows installing unsigned binaries

  • force (bool) – force installation if the spec is already present in the local store

  • sha256 (str) – optional sha256 of the binary package, to be checked before installation

spack.binary_distribution.install_single_spec(spec, unsigned=False, force=False)[source]

Install a single concrete spec from a buildcache.

Parameters:
  • spec (spack.spec.Spec) – spec to be installed

  • unsigned (bool) – if True allows installing unsigned binaries

  • force (bool) – force installation if the spec is already present in the local store

spack.binary_distribution.needs_rebuild(spec, mirror_url)[source]
spack.binary_distribution.push(spec: Spec, mirror_url: str, options: PushOptions)[source]

Create and push binary package for a single spec to the specified mirror url.

Parameters:
  • spec – Spec to package and push

  • mirror_url – Desired destination url for binary package

  • options

Returns:

True if package was pushed, False otherwise.

spack.binary_distribution.push_keys(*mirrors, **kwargs)[source]

Upload pgp public keys to the given mirrors

spack.binary_distribution.push_or_raise(spec: Spec, out_url: str, options: PushOptions)[source]

Build a tarball from given spec and put it into the directory structure used at the mirror (following <tarball_directory_name>).

This method raises NoOverwriteException when force=False and the tarball or spec.json file already exist in the buildcache.

spack.binary_distribution.read_buildinfo_file(prefix)[source]

Read buildinfo file

spack.binary_distribution.relocate_package(spec)[source]

Relocate the given package

spack.binary_distribution.select_signing_key(key=None)[source]
spack.binary_distribution.sign_specfile(key, force, specfile_path)[source]
spack.binary_distribution.specs_to_be_packaged(specs: List[Spec], root: bool = True, dependencies: bool = True) List[Spec][source]

Return the list of nodes to be packaged, given a list of specs. Raises NotInstalledError if a spec is not installed but picked to be packaged.

Parameters:
  • specs – list of root specs to be processed

  • root – include the root of each spec in the nodes

  • dependencies – include the dependencies of each spec in the nodes
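A brief sketch, assuming root_spec is an installed concrete spec:

import spack.binary_distribution as bindist

# Expand an installed root spec into all the nodes that should be pushed,
# including its dependencies.
nodes = bindist.specs_to_be_packaged([root_spec], root=True, dependencies=True)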

spack.binary_distribution.tarball_directory_name(spec)[source]

Return name of the tarball directory according to the convention <os>-<architecture>/<compiler>/<package>-<version>/

spack.binary_distribution.tarball_name(spec, ext)[source]

Return the name of the tarfile according to the convention <os>-<architecture>-<package>-<dag_hash><ext>

spack.binary_distribution.tarball_path_name(spec, ext)[source]

Return the full path+name for a given spec according to the convention <tarball_directory_name>/<tarball_name>

spack.binary_distribution.tarfile_of_spec_prefix(tar: TarFile, prefix: str) None[source]

Create a tarfile of an install prefix of a spec. Skips existing buildinfo file.

Parameters:
  • tar – tarfile object to add files to

  • prefix – absolute install prefix of spec

spack.binary_distribution.try_direct_fetch(spec, mirrors=None)[source]

Try to find the spec directly on the configured mirrors

spack.binary_distribution.try_fetch(url_to_fetch)[source]

Utility function to try and fetch a file from a url, stage it locally, and return the path to the staged file.

Parameters:

url_to_fetch (str) – Url pointing to remote resource to fetch

Returns:

Path to locally staged resource or None if it could not be fetched.

spack.binary_distribution.try_verify(specfile_path)[source]

Utility function to attempt to verify a local file. Assumes the file is a clearsigned signature file.

Parameters:

specfile_path (str) – Path to file to be verified.

Returns:

True if the signature could be verified, False otherwise.

spack.binary_distribution.update_cache_and_get_specs()[source]

Get all concrete specs for build caches available on configured mirrors. Initialization of internal cache data structures is done as lazily as possible, so this method will also attempt to initialize and update the local index cache (essentially a no-op if it has been done already and nothing has changed on the configured mirrors.)

Throws:

FetchCacheError

spack.build_environment module

This module contains all routines related to setting up the package build environment. All of this is set up by package.py just before install() is called.

There are two parts to the build environment:

  1. Python build environment (i.e. install() method)

    This is how things are set up when install() is called. Spack takes advantage of each package being in its own module by adding a bunch of command-like functions (like configure(), make(), etc.) in the package’s module scope. This allows package writers to call them all directly in Package.install() without writing ‘self.’ everywhere. No, this isn’t Pythonic. Yes, it makes the code more readable and more like the shell script from which someone is likely porting.

  2. Build execution environment

    This is the set of environment variables, like PATH, CC, CXX, etc. that control the build. There are also a number of environment variables used to pass information (like RPATHs and other information about dependencies) to Spack’s compiler wrappers. All of these env vars are also set up here.

Skimming this module is a nice way to get acquainted with the types of calls you can make from within the install() function.

exception spack.build_environment.ChildError(msg, module, classname, traceback_string, log_name, log_type, context)[source]

Bases: InstallError

Special exception class for wrapping exceptions from child processes

in Spack’s build environment.

The main features of a ChildError are:

  1. They’re serializable, so when a child build fails, we can send one of these to the parent and let the parent report what happened.

  2. They have a traceback field containing a traceback generated on the child immediately after failure. Spack will print this on failure in lieu of trying to run sys.excepthook on the parent process, so users will see the correct stack trace from a child.

  3. They also contain context, which shows context in the Package implementation where the error happened. This helps people debug Python code in their packages. To get it, Spack searches the stack trace for the deepest frame where self is in scope and is an instance of PackageBase. This will generally find a useful spot in the package.py file.

The long_message of a ChildError displays one of two things:

  1. If the original error was a ProcessError, indicating a command died during the build, we’ll show context from the build log.

  2. If the original error was any other type of error, we’ll show context from the Python code.

SpackError handles displaying the special traceback if we’re in debug mode with spack -d.

build_errors = [('spack.util.executable', 'ProcessError')]
property long_message
class spack.build_environment.EnvironmentVisitor(*roots: Spec, context: Context)[source]

Bases: object

neighbors(item)[source]
class spack.build_environment.MakeExecutable(name, jobs, **kwargs)[source]

Bases: Executable

Special callable executable object for make so the user can specify parallelism options on a per-invocation basis. Specifying ‘parallel’ to the call will override whatever the package’s global setting is, so you can either default to true or false and override particular calls. Specifying ‘jobs_env’ to a particular call will name an environment variable which will be set to the parallelism level (without affecting the normal invocation with -j).
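A hedged sketch of the per-invocation options; inside a package’s install() the make callable is normally provided for you, and the environment variable name is arbitrary:

from spack.build_environment import MakeExecutable

make = MakeExecutable("make", jobs=8)

make("install")                   # typically runs: make -j8 install
make("check", parallel=False)     # serial run, overriding the default above
make("all", jobs_env="MAX_JOBS")  # also export the job count in MAX_JOBS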

class spack.build_environment.ModuleChangePropagator(package)[source]

Bases: object

Wrapper class to accept changes to a package.py Python module, and propagate them in the MRO of the package.

It is mainly used as a substitute of the package.py module, when calling the “setup_dependent_package” function during build environment setup.

propagate_changes_to_mro()[source]
class spack.build_environment.SetupContext(*specs: Spec, context: Context)[source]

Bases: object

This class encapsulates the logic to determine environment modifications, and is used as well to set globals in modules of package.py.

get_env_modifications() EnvironmentModifications[source]

Returns the environment variable modifications for the given input specs and context. Environment modifications include:

  • Updating PATH for packages that are required at runtime

  • Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective tools can find Spack-built dependencies (when context=build)

  • Running custom package environment modifications: setup_run_environment, setup_dependent_run_environment, setup_build_environment, setup_dependent_build_environment.

The (partial) order imposed on the specs is externals first, then topological from leaf to root. That way externals cannot contribute search paths that would shadow Spack’s prefixes, and dependents override variables set by dependencies.

set_all_package_py_globals()[source]

Set the globals in modules of package.py files.

exception spack.build_environment.StopPhase(message, long_message=None)[source]

Bases: SpackError

Pickle-able exception to control stopped builds.

class spack.build_environment.UseMode(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Flag

ADDED = 32

Flag is set when the (node, mode) is finalized

BUILDTIME = 16

A spec that should be visible in search paths in a build env.

BUILDTIME_DIRECT = 8

A spec that’s a direct build or test dep

ROOT = 1

Entrypoint spec (a spec to be built; an env root, etc)

RUNTIME = 2

A spec used at runtime, but no executables in PATH

RUNTIME_EXECUTABLE = 4

A spec used at runtime, with executables in PATH

spack.build_environment.clean_environment()[source]
spack.build_environment.effective_deptypes(*specs: Spec, context: Context = Context.BUILD) List[Tuple[Spec, UseMode]][source]

Given a list of input specs and a context, return a list of tuples of all specs that contribute to (environment) modifications, together with a flag specifying in what way they do so. The list is ordered topologically from root to leaf, meaning that environment modifications should be applied in reverse so that dependents override dependencies, not the other way around.

spack.build_environment.get_cmake_prefix_path(pkg)[source]
spack.build_environment.get_effective_jobs(jobs, parallel=True, supports_jobserver=False)[source]

Return the number of jobs, or None if supports_jobserver and a jobserver is detected.

spack.build_environment.get_package_context(traceback, context=3)[source]

Return some context for an error message when the build fails.

Parameters:
  • traceback – A traceback from some exception raised during install

  • context (int) – Lines of context to show before and after the line where the error happened

This function inspects the stack to find where we failed in the package file, and it adds detailed context to the long_message from there.

spack.build_environment.get_rpath_deps(pkg)[source]

Return immediate or transitive RPATHs depending on the package.

spack.build_environment.get_rpaths(pkg)[source]

Get a list of all the rpaths for a package.

spack.build_environment.jobserver_enabled()[source]

Returns true if a posix jobserver (make) is detected.

spack.build_environment.load_external_modules(pkg)[source]

Traverse a package’s spec DAG and load any external modules.

Traverse a package’s dependencies and load any external modules associated with them.

Parameters:

pkg (spack.package_base.PackageBase) – package to load deps for

spack.build_environment.set_compiler_environment_variables(pkg, env)[source]
spack.build_environment.set_package_py_globals(pkg, context: Context = Context.BUILD)[source]

Populate the Python module of a package with some useful global names. This makes things easier for package writers.

spack.build_environment.set_wrapper_variables(pkg, env)[source]

Set environment variables used by the Spack compiler wrapper (which have the prefix SPACK_) and also add the compiler wrappers to PATH.

This determines the injected -L/-I/-rpath options; each of these specifies a search order and this function computes these options in a manner that is intended to match the DAG traversal order in SetupContext. TODO: this is not the case yet, we’re using post order, SetupContext is using topo order.

spack.build_environment.setup_package(pkg, dirty, context: Context = Context.BUILD)[source]

Execute all environment setup routines.

spack.build_environment.start_build_process(pkg, function, kwargs)[source]

Create a child process to do part of a spack build.

Usage:

def child_fun():
    # do stuff
build_env.start_build_process(pkg, child_fun, kwargs={})

The child process is run with the build environment set up by spack.build_environment. This allows package authors to have full control over the environment, etc. without affecting other builds that might be executed in the same spack call.

If something goes wrong, the child process catches the error and passes it to the parent wrapped in a ChildError. The parent is expected to handle (or re-raise) the ChildError.

This uses multiprocessing.Process to create the child process. The mechanism used to create the process differs on different operating systems and for different versions of Python. In some cases “fork” is used (i.e. the “fork” system call) and some cases it starts an entirely new Python interpreter process (in the docs this is referred to as the “spawn” start method). Breaking it down by OS:

  • Linux always uses fork.

  • Mac OS uses fork before Python 3.8 and “spawn” for 3.8 and after.

  • Windows always uses the “spawn” start method.

For more information on multiprocessing child process creation mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods

spack.build_environment.write_log_summary(out, log_type, log, last=None)[source]

spack.builder module

spack.builder.BUILDER_CLS = {'cmake': <class 'spack.build_systems.cmake.CMakeBuilder'>, 'meson': <class 'spack.build_systems.meson.MesonBuilder'>, 'python_pip': <class 'spack.build_systems.python.PythonPipBuilder'>}

Builder classes, as registered by the “builder” decorator

class spack.builder.Builder(pkg)[source]

Bases: Sequence

A builder is a class that, given a package object (i.e. one associated with a concrete spec), knows how to install it.

The builder behaves like a sequence; when iterated over, it returns the “phases” of the installation in the correct order.

Parameters:

pkg (spack.package_base.PackageBase) – package object to be built

archive_files: List[str] = []

List of glob expressions. Each expression must either be absolute or relative to the package source path. Matching artifacts found at the end of the build process will be copied in the same directory tree as _spack_build_logfile and _spack_build_envfile.

build_system: str | None = None

Build system name. Must also be defined in derived classes.

build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
legacy_attributes: Tuple[str, ...] = ()
legacy_methods: Tuple[str, ...] = ()
phases: Tuple[str, ...] = ()

Sequence of phases. Must be defined in derived classes

property prefix
run_after_callbacks = []
run_before_callbacks = []
setup_build_environment(env)[source]

Sets up the build environment for a package.

This method will be called before the current package prefix exists in Spack’s store.

Parameters:

env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the package is built. Package authors can call methods on it to alter the build environment.

setup_dependent_build_environment(env, dependent_spec)[source]

Sets up the build environment of packages that depend on this one.

This is similar to setup_build_environment, but it is used to modify the build environments of packages that depend on this one.

This gives packages like Python and others that follow the extension model a way to implement common environment or compile-time settings for dependencies.

This method will be called before the dependent package prefix exists in Spack’s store.

Examples

1. Installing python modules generally requires PYTHONPATH to point to the lib/pythonX.Y/site-packages directory in the module’s install prefix. This method could be used to set that variable.

Parameters:
  • env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the dependent package is built. Package authors can call methods on it to alter the build environment.

  • dependent_spec (spack.spec.Spec) – the spec of the dependent package about to be built. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as self.spec
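A hedged sketch of the PYTHONPATH example above, for a hypothetical extendable package; the directory layout is illustrative and this is not a complete builder:

import os.path

from spack.builder import Builder


class MyPythonLikeBuilder(Builder):
    def setup_dependent_build_environment(self, env, dependent_spec):
        # Make this package's site-packages directory visible to the
        # dependent package's build. "pythonX.Y" stands in for the real
        # version-specific directory name.
        site_packages = os.path.join(self.prefix.lib, "pythonX.Y", "site-packages")
        env.prepend_path("PYTHONPATH", site_packages)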

property spec
property stage
test()[source]
class spack.builder.BuilderMeta(name, bases, attr_dict)[source]

Bases: PhaseCallbacksMeta, ABCMeta

class spack.builder.CallbackTemporaryStage(attribute_name, callbacks)

Bases: tuple

An object of this kind is a shared global state used to collect callbacks during class definition time, and is flushed when the class object is created at the end of the class definition

Parameters:
  • attribute_name (str) – name of the attribute that will be attached to the builder

  • callbacks (list) – container used to temporarily aggregate the callbacks

attribute_name

Alias for field number 0

callbacks

Alias for field number 1

class spack.builder.InstallationPhase(name, builder)[source]

Bases: object

Manages a single phase of the installation.

This descriptor stores, at creation time, the name of the method it should look up for execution. The method is retrieved at __get__ time, so that it can be overridden by subclasses of whatever class declared the phases.

It also provides hooks to execute arbitrary callbacks before and after the phase.

copy()[source]
execute()[source]
class spack.builder.PhaseCallbacksMeta(name, bases, attr_dict)[source]

Bases: type

Permits registering arbitrary functions during class definition and running them later, before or after a given install phase.

Each method decorated with run_before or run_after gets temporarily stored in a global shared state when a class being defined is parsed by the Python interpreter. At class definition time that temporary storage gets flushed and a list of callbacks is attached to the class being defined.

static run_after(phase, when=None)[source]

Decorator to register a function for running after a given phase.

Parameters:
  • phase (str) – phase after which the function must run.

  • when (str) – condition under which the function is run (if None, it is always run).

static run_before(phase, when=None)[source]

Decorator to register a function for running before a given phase.

Parameters:
  • phase (str) – phase before which the function must run.

  • when (str) – condition under which the function is run (if None, it is always run).

spack.builder.builder(build_system_name)[source]

Class decorator used to register the default builder for a given build-system.

Parameters:

build_system_name (str) – name of the build-system
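A hedged sketch of registering a default builder for a hypothetical “mybuild” build system, using the run_after decorator described above; the build-system name, phases, and phase-method signatures are assumptions for illustration only:

import spack.builder
from spack.builder import Builder, run_after


@spack.builder.builder("mybuild")
class MyBuildBuilder(Builder):
    """Hypothetical default builder for a 'mybuild' build system."""

    build_system = "mybuild"
    phases = ("configure", "install")

    def configure(self, pkg, spec, prefix):
        pass  # hypothetical configure step

    def install(self, pkg, spec, prefix):
        pass  # hypothetical install step

    @run_after("install")
    def _post_install_check(self):
        pass  # callback executed after the install phase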

spack.builder.buildsystem_name(pkg)[source]

Given a package object with an associated concrete spec, return the name of its build system.

Parameters:

pkg (spack.package_base.PackageBase) – package for which we want the build system name

spack.builder.create(pkg)[source]

Given a package object with an associated concrete spec, return the builder object that can install it.

Parameters:

pkg (spack.package_base.PackageBase) – package for which we want the builder

spack.builder.run_after(phase, when=None)

Decorator to register a function for running after a given phase.

Parameters:
  • phase (str) – phase after which the function must run.

  • when (str) – condition under which the function is run (if None, it is always run).

spack.builder.run_before(phase, when=None)

Decorator to register a function for running before a given phase.

Parameters:
  • phase (str) – phase before which the function must run.

  • when (str) – condition under which the function is run (if None, it is always run).

spack.caches module

Caches used by Spack to store data

spack.caches.FETCH_CACHE: FsCache | Singleton = <spack.fetch_strategy.FsCache object>

Spack’s local cache for downloaded source archives

spack.caches.MISC_CACHE: FileCache | Singleton = <spack.util.file_cache.FileCache object>

Spack’s cache for small data

class spack.caches.MirrorCache(root, skip_unstable_versions)[source]

Bases: object

store(fetcher, relative_dest)[source]

Fetch and relocate the fetcher’s target into our mirror cache.

Symlink a human readable path in our mirror to the actual storage location.

spack.caches.fetch_cache_location()[source]

Filesystem cache of downloaded archives.

This prevents Spack from repeatedly fetching the same files when building the same package in different ways or multiple times.

spack.caches.misc_cache_location()[source]

The MISC_CACHE is Spack’s cache for small data.

Currently the MISC_CACHE stores indexes for virtual dependency providers and for which packages provide which tags.

spack.ci module

class spack.ci.CDashHandler(ci_cdash)[source]

Bases: object

Class for managing CDash data and processing.

args()[source]
property build_name

Returns the CDash build name.

A name will be generated if the current_spec property is set; otherwise, the value will be retrieved from the environment through the SPACK_CDASH_BUILD_NAME variable.

Returns: (str) current spec’s CDash build name.

property build_stamp

Returns the CDash build stamp.

The one defined by SPACK_CDASH_BUILD_STAMP environment variable is preferred due to the representation of timestamps; otherwise, one will be built.

Returns: (str) current CDash build stamp

copy_test_results(source, dest)[source]

Copy test results to artifacts directory.

create_buildgroup(opener, headers, url, group_name, group_type)[source]
populate_buildgroup(job_names)[source]
property project_enc
report_skipped(spec: Spec, report_dir: str, reason: str | None)[source]

Explicitly report skipping testing of a spec (e.g., its CI configuration identifies it as known to have broken tests, or the CI installation failed).

Parameters:
  • spec – spec being tested

  • report_dir – directory where the report will be written

  • reason – reason the test is being skipped

property upload_url
class spack.ci.PushResult(success, url)

Bases: tuple

success

Alias for field number 0

url

Alias for field number 1

class spack.ci.RebuildDecision[source]

Bases: object

class spack.ci.SpackCI(ci_config, spec_labels, stages)[source]

Bases: object

Spack CI object used to generate intermediate representation used by the CI generator(s).

generate_ir()[source]

Generate the IR from the Spack CI configurations.

class spack.ci.TemporaryDirectory[source]

Bases: object

spack.ci.can_sign_binaries()[source]

Utility method to determine if this spack instance is capable of signing binary packages. This is currently only possible if the spack gpg keystore contains exactly one secret key.

spack.ci.can_verify_binaries()[source]

Utility method to determine if this spack instance is capable (at least in theory) of verifying signed binaries.

spack.ci.compute_affected_packages(rev1='HEAD^', rev2='HEAD')[source]

Determine which packages were added, removed or changed between rev1 and rev2, and return the names as a set

spack.ci.copy_files_to_artifacts(src, artifacts_dir)[source]

Copy file(s) to the given artifacts directory

Parameters:
  • src (str) – the glob-friendly path expression for the file(s) to copy

  • artifacts_dir (str) – the destination directory

spack.ci.copy_stage_logs_to_artifacts(job_spec: Spec, job_log_dir: str) None[source]

Copy selected build stage file(s) to the given artifacts directory

Looks for build logs in the stage directory of the given job_spec, and attempts to copy the files into the directory given by job_log_dir.

Parameters:
  • job_spec – spec associated with spack install log

  • job_log_dir – path into which build log should be copied

spack.ci.copy_test_logs_to_artifacts(test_stage, job_test_dir)[source]

Copy test log file(s) to the given artifacts directory

Parameters:
  • test_stage (str) – test stage path

  • job_test_dir (str) – the destination artifacts test directory

spack.ci.create_buildcache(input_spec: Spec, *, destination_mirror_urls: List[str], sign_binaries: bool = False) List[PushResult][source]

Create the buildcache at the provided mirror(s).

Parameters:
  • input_spec – Installed spec to package and push

  • destination_mirror_urls – List of urls to push to

  • sign_binaries – Whether or not to sign buildcache entry

Returns: A list of PushResults, indicating success or failure.

spack.ci.display_broken_spec_messages(base_url, hashes)[source]

Fetch the broken spec file for each of the hashes under the base_url and print a message with some details about each one.

spack.ci.download_and_extract_artifacts(url, work_dir)[source]

Look for gitlab artifacts.zip at the given url, and attempt to download and extract the contents into the given work_dir.

Parameters:
  • url (str) – Complete url to artifacts.zip file

  • work_dir (str) – Path to destination where artifacts should be extracted

spack.ci.generate_gitlab_ci_yaml(env, print_summary, output_file, prune_dag=False, check_index_only=False, run_optimizer=False, use_dependencies=False, artifacts_root=None, remote_mirror_override=None)[source]

Generate a gitlab yaml file to run a dynamic child pipeline from the spec matrix in the active environment.

Parameters:
  • env (spack.environment.Environment) – Activated environment object which must contain a gitlab-ci section describing how to map specs to runners

  • print_summary (bool) – Should we print a summary of all the jobs in the stages in which they were placed.

  • output_file (str) – File path where generated file should be written

  • prune_dag (bool) – If True, do not generate jobs for specs that are already built on the mirror.

  • check_index_only (bool) – If True, attempt to fetch the mirror index and use only that to determine whether built specs exist on the mirror (this mode results in faster yaml generation time). Otherwise, also check each spec directly by url (useful if there is no index or it might be out of date).

  • run_optimizer (bool) – If True, post-process the generated yaml to try to reduce its size (attempts to collect repeated configuration and replace it with definitions).

  • use_dependencies (bool) – If true, use “dependencies” rather than “needs” (“needs” allows DAG scheduling). Useful if gitlab instance cannot be configured to handle more than a few “needs” per job.

  • artifacts_root (str) – Path where artifacts like logs, environment files (spack.yaml, spack.lock), etc should be written. GitLab requires this to be within the project directory.

  • remote_mirror_override (str) – Typically only needed when one spack.yaml is used to populate several mirrors with binaries, based on some criteria. Spack protected pipelines populate different mirrors based on branch name, facilitated by this option. DEPRECATED

spack.ci.get_change_revisions()[source]

If this is a git repo get the revisions to use when checking for changed packages and spack core modules.

spack.ci.get_job_name(spec: Spec, build_group: str = '')[source]

Given a spec and possibly a build group, return the job name. If the resulting name is longer than 255 characters, it will be truncated.

Parameters:
  • spec (spack.spec.Spec) – Spec job will build

  • build_group (str) – Name of build group this job belongs to (a CDash notion)

Returns: The job name

spack.ci.get_spack_info()[source]

If spack is running from a git repo, return the most recent git log entry, otherwise, return a string containing the spack version.

spack.ci.get_spec_filter_list(env, affected_pkgs, dependent_traverse_depth=None)[source]

Given a list of package names and an active/concretized environment, return the set of all concrete specs from the environment that could have been affected by changing the list of packages.

If a dependent_traverse_depth is given, it is used to limit upward (in the parent direction) traversal of specs of touched packages. E.g. if 1 is provided, then only direct dependents of touched package specs are traversed to produce specs that could have been affected by changing the package, while if 0 is provided, only the changed specs themselves are traversed. If None is given, upward traversal of touched package specs is done all the way to the environment roots. Providing a negative number results in no traversals at all, yielding an empty set.

Parameters:
  • env (spack.environment.Environment) – Active concrete environment

  • affected_pkgs (List[str]) – Affected package names

  • dependent_traverse_depth – Optional integer to limit dependent traversal, or None to disable the limit.

Returns:

A set of concrete specs from the active environment including those associated with affected packages, their dependencies and dependents, as well as their dependents’ dependencies.

spack.ci.get_stack_changed(env_path, rev1='HEAD^', rev2='HEAD')[source]

Given an environment manifest path and two revisions to compare, return whether or not the stack was changed. Returns True if the environment manifest changed between the provided revisions (or additionally if the .gitlab-ci.yml file itself changed). Returns False otherwise.

spack.ci.import_signing_key(base64_signing_key)[source]

Given a Base64-encoded gpg key, decode and import it to use for signing packages.

Parameters:
  • base64_signing_key (str) – A gpg key including the secret key, armor-exported and base64 encoded, so it can be stored in a gitlab CI variable. For an example of how to generate such a key, see: https://github.com/spack/spack-infrastructure/blob/main/gitlab-docker/files/gen-key

spack.ci.process_command(name, commands, repro_dir, run=True, exit_on_failure=True)[source]

Create a script for and run the command. Copy the script to the reproducibility directory.

Parameters:
  • name (str) – name of the command being processed

  • commands (list) – list of arguments for single command or list of lists of arguments for multiple commands. No shell escape is performed.

  • repro_dir (str) – Job reproducibility directory

  • run (bool) – Run the script and return the exit code if True

Returns: the exit code from processing the command

spack.ci.push_mirror_contents(input_spec: Spec, mirror_url, sign_binaries)[source]

Push one or more binary packages to the mirror.

Parameters:
  • input_spec (spack.spec.Spec) – Installed spec to push

  • mirror_url (str) – Base url of target mirror

  • sign_binaries (bool) – If True, spack will attempt to sign binary package before pushing.

spack.ci.read_broken_spec(broken_spec_url)[source]

Read data from broken specs file located at the url, return as a yaml object.

spack.ci.remove_other_mirrors(mirrors_to_keep, scope=None)[source]

Remove all mirrors from the given config scope, the exceptions being any listed in mirrors_to_keep, which is a list of mirror urls.

spack.ci.reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime)[source]

Given a url to gitlab artifacts.zip from a failed ‘spack ci rebuild’ job, attempt to set up an environment in which the failure can be reproduced locally. This entails the following:

First download and extract artifacts. Then look through those artifacts to glean some information needed for the reproducer (e.g. one of the artifacts contains information about the version of spack tested by gitlab, another is the generated pipeline yaml containing details of the job, like the docker image used to run it). The output of this function is a set of printed instructions for running docker, and then the commands to run to reproduce the build once inside the container.

spack.ci.run_standalone_tests(**kwargs)[source]

Run stand-alone tests on the current spec.

Parameters:

kwargs (dict) – dictionary of arguments used to run the tests

List of recognized keys:

  • “cdash” (CDashHandler): (optional) cdash handler instance

  • “fail_fast” (bool): (optional) terminate tests after the first failure

  • “log_file” (str): (optional) test log file name if NOT CDash reporting

  • “job_spec” (Spec): spec that was built

  • “repro_dir” (str): reproduction directory
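A hedged sketch using the recognized keys above; job_spec and the paths are placeholders:

import spack.ci

spack.ci.run_standalone_tests(
    job_spec=spec,
    repro_dir="/tmp/repro",
    log_file="test-results.log",  # non-CDash reporting
    fail_fast=True,
)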

spack.ci.setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None)[source]

Look in the local spack clone to find the checkout_commit, and if provided, the merge_commit given as arguments. If those commits can be found locally, then clone spack and attempt to recreate a merge commit with the same parent commits as tested in gitlab. This looks something like: 1) git clone repo && cd repo 2) git checkout <checkout_commit> 3) git merge <merge_commit>. If there is no merge_commit provided, then skip step (3).

Parameters:
  • repro_dir (str) – Location where spack should be cloned

  • checkout_commit (str) – SHA of PR branch commit

  • merge_commit (str) – SHA of target branch parent

Returns: True if git repo state was successfully recreated, or False otherwise.

spack.ci.stage_spec_jobs(specs)[source]

Take a set of release specs and generate a list of “stages”, where the jobs in any stage are dependent only on jobs in previous stages. This allows us to maximize build parallelism within the gitlab-ci framework.

Parameters:

specs (Iterable) – Specs to build

Returns: A tuple of information objects describing the specs, dependencies and stages:

  • spec_labels: A dictionary mapping the spec labels (which are formatted as pkg-name/hash-prefix) to concrete specs.

  • deps: A dictionary where the keys should also have appeared as keys in the spec_labels dictionary, and the values are the set of dependencies for that spec.

  • stages: An ordered list of sets, each of which contains all the jobs to be built in that stage. The jobs are expressed in the same format as the keys in the spec_labels and deps objects.

spack.ci.translate_deprecated_config(config)[source]
spack.ci.write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict)[source]

Given a url to write to and the details of the failed job, write an entry in the broken specs list.

spack.ci_needs_workaround module

spack.ci_needs_workaround.convert_job(job_entry)[source]
spack.ci_needs_workaround.get_job_name(needs_entry)
spack.ci_needs_workaround.needs_to_dependencies(yaml)[source]

spack.ci_optimization module

spack.ci_optimization.add_extends(yaml, key)[source]

Modifies the given object “yaml” so that it includes an “extends” key whose value features “key”.

If “extends” is not in yaml, then yaml is modified such that yaml[“extends”] == key.

If yaml[“extends”] is a str, then yaml is modified such that yaml[“extends”] == [yaml[“extends”], key]

If yaml[“extends”] is a list that does not include key, then key is appended to the list.

Otherwise, yaml is left unchanged.
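A short sketch of the behavior described above:

import spack.ci_optimization as ci_opt

job = {"script": ["spack ci rebuild"]}

ci_opt.add_extends(job, ".common")
# job == {"script": ["spack ci rebuild"], "extends": ".common"}

ci_opt.add_extends(job, ".linux")
# "extends" was a str, so it becomes a list:
# job == {"script": ["spack ci rebuild"], "extends": [".common", ".linux"]}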

spack.ci_optimization.build_histogram(iterator, key)[source]

Builds a histogram of values given an iterable of mappings and a key.

For each mapping “m” with key “key” in iterator, the value m[key] is considered.

Returns a list of tuples (hash, count, proportion, value), where

  • “hash” is a sha1sum hash of the value.

  • “count” is the number of occurrences of values that hash to “hash”.

  • “proportion” is the proportion of all values considered above that hash to “hash”.

  • “value” is one of the values considered above that hash to “hash”. Which value is chosen when multiple values hash to the same “hash” is undefined.

The list is sorted in descending order by count, yielding the most frequently occurring hashes first.
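A hedged sketch, assuming the values under the key are serializable for hashing:

import spack.ci_optimization as ci_opt

jobs = [
    {"tags": ["small"]},
    {"tags": ["small"]},
    {"tags": ["large"]},
]

# Most frequent values come first; each entry is (hash, count, proportion, value).
for value_hash, count, proportion, value in ci_opt.build_histogram(jobs, "tags"):
    print(count, proportion, value)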

spack.ci_optimization.common_subobject(yaml, sub)[source]

Factor prototype object “sub” out of the values of mapping “yaml”.

Consider a modified copy of yaml, “new”, where for each key, “key” in yaml:

  • If yaml[key] matches sub, then new[key] = subkeys(yaml[key], sub).

  • Otherwise, new[key] = yaml[key].

If the above match criterion is not satisfied for any such key, then (yaml, None) is returned and the yaml object is left unchanged.

Otherwise, each matching value in new is modified as in add_extends(new[key], common_key), and then new[common_key] is set to sub. The common_key value is chosen such that it does not match any preexisting key in new. In this case, (new, common_key) is returned.

spack.ci_optimization.matches(obj, proto)[source]

Returns True if the test object “obj” matches the prototype object “proto”.

If obj and proto are mappings, obj matches proto if (key in obj) and (obj[key] matches proto[key]) for every key in proto.

If obj and proto are sequences, obj matches proto if they are of the same length and (a matches b) for every (a,b) in zip(obj, proto).

Otherwise, obj matches proto if obj == proto.

Precondition: proto must not have any reference cycles
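A short sketch of the matching rules described above:

import spack.ci_optimization as ci_opt

proto = {"tags": ["spack"], "variables": {"ARCH": "x86_64"}}
obj = {
    "tags": ["spack"],
    "variables": {"ARCH": "x86_64", "EXTRA": "1"},
    "script": ["make"],
}

# Every key of proto is present in obj with a matching value, so obj matches.
assert ci_opt.matches(obj, proto)

# Sequences only match when they have the same length and match element-wise.
assert not ci_opt.matches(["a"], ["a", "b"])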

spack.ci_optimization.optimizer(yaml)[source]
spack.ci_optimization.print_delta(name, old, new, applied=None)[source]
spack.ci_optimization.sort_yaml_obj(obj)[source]
spack.ci_optimization.subkeys(obj, proto)[source]

Returns the test mapping “obj” after factoring out the items it has in common with the prototype mapping “proto”.

Consider a recursive merge operation, merge(a, b), on mappings a and b, that returns a mapping, m, whose keys are the union of the keys of a and b, and for every such key, “key”, its corresponding value is:

  • merge(a[key], b[key]) if a[key] and b[key] are mappings, or

  • b[key] if (key in b) and not matches(a[key], b[key]),

    or

  • a[key] otherwise

If obj and proto are mappings, the returned object is the smallest object, “a”, such that merge(a, proto) matches obj.

Otherwise, obj is returned.
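
A sketch of the factoring described above, with a hypothetical prototype:

proto = {"image": "ubuntu:22.04", "tags": ["spack"]}
obj = {"image": "ubuntu:22.04", "tags": ["spack"], "script": ["spack ci rebuild"]}
subkeys(obj, proto)
# expected: {"script": ["spack ci rebuild"]}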

spack.ci_optimization.try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs)[source]

Try applying an optimization pass and return information about the result

“name” is a string describing the nature of the pass. If it is a non-empty string, summary statistics are also printed to stdout.

“yaml” is the object to apply the pass to.

“optimization_pass” is the function implementing the pass to be applied.

“args” and “kwargs” are the additional arguments to pass to the optimization pass. The pass is applied as

>>> (new_yaml, *other_results) = optimization_pass(yaml, *args, **kwargs)

The pass’s results are greedily rejected if it does not modify the original yaml document, or if it produces a yaml document that serializes to a larger string.

Returns (new_yaml, yaml, applied, other_results) if applied, or (yaml, new_yaml, applied, other_results) otherwise.

spack.compiler module

class spack.compiler.Compiler(cspec, operating_system, target, paths, modules=None, alias=None, environment=None, extra_rpaths=None, enable_implicit_rpaths=None, **kwargs)[source]

Bases: object

This class encapsulates a Spack “compiler”, which includes C, C++, and Fortran compilers. Subclasses should implement support for specific compilers, their possible names, arguments, and how to identify the particular type of compiler.

PrgEnv: str | None = None
PrgEnv_compiler: str | None = None
property c11_flag
property c99_flag
cc_names: List[str] = []
property cc_pic_flag

Returns the flag used by the C compiler to produce Position Independent Code (PIC).

property cc_rpath_arg
classmethod cc_version(cc)[source]
compiler_environment()[source]
property cxx11_flag
property cxx14_flag
property cxx17_flag
property cxx98_flag
cxx_names: List[str] = []
property cxx_pic_flag

Returns the flag used by the C++ compiler to produce Position Independent Code (PIC).

property cxx_rpath_arg
classmethod cxx_version(cxx)[source]
property debug_flags
classmethod default_version(cc)[source]

Override just this to override all compiler version functions.

property disable_new_dtags
property enable_new_dtags
classmethod extract_version_from_output(output)[source]

Extracts the version from compiler’s output.

f77_names: List[str] = []
property f77_pic_flag

Returns the flag used by the F77 compiler to produce Position Independent Code (PIC).

property f77_rpath_arg
classmethod f77_version(f77)[source]
fc_names: List[str] = []
property fc_pic_flag

Returns the flag used by the FC compiler to produce Position Independent Code (PIC).

property fc_rpath_arg
classmethod fc_version(fc)[source]
get_real_version()[source]

Query the compiler for its version.

This is the “real” compiler version, regardless of what is in the compilers.yaml file, which the user can change to name their compiler.

Use the runtime environment of the compiler (modules and environment modifications) to enable the compiler to run properly on any platform.

ignore_version_errors: Sequence[int] = ()

Return values to ignore when invoking the compiler to get its version

implicit_rpaths()[source]
is_supported_on_platform()

Platform matcher for Platform objects supported by compiler

property linker_arg

Flag that needs to be used to pass an argument to the linker.

property openmp_flag
property opt_flags
property prefix

Query the compiler for its install prefix. This is the install path as reported by the compiler. Note that paths for cc, cxx, etc. are not enough to find the install prefix of the compiler, since they can be symlinks, wrappers, or filenames instead of absolute paths.

prefixes: List[str] = []
property real_version

The compiler version as reported by the executable, used for API determinations

E.g. C++11 flag checks.

property required_libs

For executables created with this compiler, the compiler libraries that would be generally required to run it.

classmethod search_regexps(language)[source]
setup_custom_environment(pkg, env)[source]

Set any environment variables necessary to use the compiler.

suffixes = ['-.*']
property verbose_flag

This property should be overridden in the compiler subclass if a verbose flag is available.

If it is not overridden, it is assumed to not be supported.

verify_executables()[source]

Raise an error if any of the compiler executables is not valid.

This method confirms that for all of the compilers (cc, cxx, f77, fc) that have paths, those paths exist and are executable by the current user. Raises a CompilerAccessError if any of the non-null paths for the compiler are not accessible.

property version
version_argument = '-dumpversion'

Compiler argument that produces version information

version_regex = '(.*)'

Regex used to extract version from compiler’s output
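
As a rough sketch of how a subclass might fill in the attributes documented above (the executable names, flags, and regex are hypothetical, not an actual Spack compiler class):

from spack.compiler import Compiler

class MyCompiler(Compiler):
    # executable names Spack should search for
    cc_names = ["mycc"]
    cxx_names = ["myc++"]
    f77_names = ["myf77"]
    fc_names = ["myfc"]

    # how to ask the compiler for its version, and how to parse the answer
    version_argument = "--version"
    version_regex = r"my compiler version ([\d.]+)"

    @property
    def openmp_flag(self):
        return "-fopenmp"

    @property
    def cxx11_flag(self):
        return "-std=c++11"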

spack.concretize module

Functions here are used to take abstract specs and make them concrete. For example, if a spec asks for a version between 1.8 and 1.9, these functions might take the most recent 1.9 version of the package available. Or, if the user didn’t specify a compiler for a spec, then this will assign a compiler to the spec based on defaults or user preferences.

TODO: make this customizable and allow users to configure concretization policies.

class spack.concretize.Concretizer(abstract_spec=None)[source]

Bases: object

You can subclass this class to override some of the default concretization strategies, or you can override all of them.

adjust_target(spec)[source]

Adjusts the target microarchitecture if the compiler is too old to support the default one.

Parameters:

spec – spec to be concretized

Returns:

True if spec was modified, False otherwise

check_for_compiler_existence = None

Controls whether we check that compiler versions actually exist during concretization. Used for testing and for mirror creation

choose_virtual_or_external(spec: Spec)[source]

Given a list of candidate virtual and external packages, try to find one that is most ABI compatible.

concretize_architecture(spec)[source]

If the spec is empty, provide the defaults of the platform. If the architecture is not a string type, then check if either the platform, target or operating system are concretized. If any of the fields are changed then return True. If everything is concretized (i.e. the architecture attribute is a namedtuple of classes) then return False. If the target is a string type, then convert the string into a concretized architecture. If it has no architecture and the root of the DAG has an architecture, then use the root; otherwise use the defaults on the platform.

concretize_compiler(spec)[source]

If the spec already has a compiler, we’re done. If not, then take the compiler used for the nearest ancestor with a compiler spec and use that. If the ancestor’s compiler is not concrete, then use the preferred compiler as specified in spackconfig.

Intuition: Use the spackconfig default if no package that depends on this one has a strict compiler requirement. Otherwise, try to build with the compiler that will be used by libraries that link to this one, to maximize compatibility.

concretize_compiler_flags(spec)[source]

The compiler flags are updated to match those of the spec whose compiler is used, defaulting to no compiler flags in the spec. Default specs set at the compiler level will still be added later.

concretize_develop(spec)[source]

Add dev_path=* variant to packages built from local source.

concretize_variants(spec)[source]

If the spec already has variants filled in, return. Otherwise, add the user preferences from packages.yaml or the default variants from the package specification.

concretize_version(spec)[source]

If the spec is already concrete, return. Otherwise take the preferred version from spackconfig, and default to the package’s version if there are no available versions.

TODO: In many cases we probably want to look for installed versions of each package and use an installed version if we can link to it. The policy implemented here will tend to rebuild a lot of stuff because it will prefer a compiler in the spec to any compiler already-installed things were built with. There is likely some better policy that finds some middle ground between these two extremes.

target_from_package_preferences(spec)[source]

Returns the preferred target from the package preferences if there’s any.

Parameters:

spec – abstract spec to be concretized

exception spack.concretize.InsufficientArchitectureInfoError(spec, archs)[source]

Bases: SpackError

Raised when details on architecture cannot be collected from the system

exception spack.concretize.NoBuildError(spec)[source]

Bases: SpecError

Raised when a package is configured with the buildable option False, but no satisfactory external versions can be found

exception spack.concretize.NoCompilersForArchError(arch, available_os_targets)[source]

Bases: SpackError

exception spack.concretize.NoValidVersionError(spec)[source]

Bases: SpackError

Raised when there is no way to have a concrete version for a particular spec.

exception spack.concretize.UnavailableCompilerVersionError(compiler_spec, arch=None)[source]

Bases: SpackError

Raised when there is no available compiler that satisfies a compiler spec.

spack.concretize.concretize_specs_together(*abstract_specs, **kwargs)[source]

Given a number of specs as input, tries to concretize them together.

Parameters:
  • tests (bool or list or set) – False to run no tests, True to test all packages, or a list of package names to run tests for some

  • *abstract_specs – abstract specs to be concretized, given either as Specs or strings

Returns:

List of concretized specs
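
A hedged usage sketch (the package names are examples; the function returns concrete Spec objects):

import spack.concretize

concrete = spack.concretize.concretize_specs_together("hdf5+mpi", "zlib")
for spec in concrete:
    print(spec)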

spack.concretize.disable_compiler_existence_check()[source]
spack.concretize.enable_compiler_existence_check()[source]
spack.concretize.find_spec(spec, condition, default=None)[source]

Searches the DAG from spec in an intelligent order and looks for a spec that matches a condition.

class spack.concretize.reverse_order(value)[source]

Bases: object

Helper for creating key functions.

This is a wrapper that inverts the sense of the natural comparisons on the object.

spack.config module

This module implements Spack’s configuration file handling.

This implements Spack’s configuration system, which handles merging multiple scopes with different levels of precedence. See the documentation on Configuration Scopes for details on how Spack’s configuration system behaves. The scopes are:

  1. default

  2. system

  3. site

  4. user

And corresponding per-platform scopes. Important functions in this module are:

get_config reads in YAML data for a particular scope and returns it. Callers can then modify the data and write it back with update_config.

When read in, Spack validates configurations against JSON schemas. The schemas are in submodules of spack.schema.
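
A sketch of that read/modify/write pattern, using the CONFIG singleton and the Configuration methods documented below (the section and scope names are examples):

import spack.config

cfg = spack.config.CONFIG.get_config("config")          # merged 'config' section
cfg["debug"] = True
spack.config.CONFIG.update_config("config", cfg, scope="user")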

spack.config.COMMAND_LINE_SCOPES: List[str] = []

configuration scopes added on the command line, set by spack.main.main()

spack.config.CONFIG: Configuration | Singleton = <spack.config.Configuration object>

This is the singleton configuration instance for Spack.

spack.config.CONFIGURATION_DEFAULTS_PATH = ('defaults', '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/etc/spack/defaults')

Path to the default configuration

spack.config.CONFIG_DEFAULTS = {'config': {'build_jobs': 2, 'build_stage': '$tempdir/spack-stage', 'checksum': True, 'concretizer': 'clingo', 'connect_timeout': 10, 'debug': False, 'dirty': False, 'license_dir': '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/etc/spack/licenses', 'verify_ssl': True}}

Hard-coded default values for some key configuration options. This ensures that Spack will still work even if config.yaml in the defaults scope is removed.

exception spack.config.ConfigError(message, long_message=None)[source]

Bases: SpackError

Superclass for all Spack config related errors.

exception spack.config.ConfigFileError(message, long_message=None)[source]

Bases: ConfigError

Issue reading or accessing a configuration file.

exception spack.config.ConfigFormatError(validation_error, data: Dict[str, Any], filename: str | None = None, line: int | None = None)[source]

Bases: ConfigError

Raised when a configuration format does not match its schema.

class spack.config.ConfigPath[source]

Bases: object

element = '(?:(?:(?:"[^"]+")|(?:\'[^\']+\'))|(?:[^:\'"]+))'
next_key_pattern = '((?:(?:(?:"[^"]+")|(?:\'[^\']+\'))|(?:[^:\'"]+))[+-]?)(?:\\:|$)'
static process(path)[source]
quoted_string = '(?:"[^"]+")|(?:\'[^\']+\')'
unquoted_string = '[^:\'"]+'
class spack.config.ConfigScope(name, path)[source]

Bases: object

This class represents a configuration scope.

A scope is one directory containing named configuration files. Each file is a config “section” (e.g., mirrors, compilers, etc.).

clear() None[source]

Empty cached config information.

get_section(section: str) Dict[str, Any] | None[source]

Returns the data associated with a given section

get_section_filename(section: str) str[source]

Returns the filename associated with a given section

property is_platform_dependent: bool

Returns true if the scope name is platform specific

exception spack.config.ConfigSectionError(message, long_message=None)[source]

Bases: ConfigError

Error for referring to a bad config section name in a configuration.

class spack.config.Configuration(*scopes: ConfigScope)[source]

Bases: object

A full Spack configuration, from a hierarchy of config files.

This class makes it easy to add a new scope on top of an existing one.

clear_caches() None[source]

Clears the caches for configuration files,

This will cause files to be re-read upon the next request.

property file_scopes: List[ConfigScope]

List of writable scopes with an associated file.

get(path: str, default: Any | None = None, scope: str | None = None) Any[source]

Get a config section or a single value from one.

Accepts a path syntax that allows us to grab nested config map entries. Getting the ‘config’ section would look like:

spack.config.get('config')

and the dirty section in the config scope would be:

spack.config.get('config:dirty')

We use : as the separator, like YAML objects.

get_config(section: str, scope: str | None = None) Dict[str, Any][source]

Get configuration settings for a section.

If scope is None or not provided, return the merged contents of all of Spack’s configuration scopes. If scope is provided, return only the configuration as specified in that scope.

This strips off the top-level name from the YAML section. That is, for a YAML config file that looks like this:

config:
  install_tree:
    root: $spack/opt/spack
  build_stage:
  - $tmpdir/$user/spack-stage

get_config('config') will return:

{ 'install_tree': {
      'root': '$spack/opt/spack',
  },
  'build_stage': ['$tmpdir/$user/spack-stage']
}

get_config_filename(scope: str, section: str) str[source]

For some scope and section, get the name of the configuration file.

highest_precedence_non_platform_scope() ConfigScope[source]

Non-internal non-platform scope with highest precedence

Platform-specific scopes are of the form scope/platform

highest_precedence_scope() ConfigScope[source]

Non-internal scope with highest precedence.

matching_scopes(reg_expr) List[ConfigScope][source]

List of all scopes whose names match the provided regular expression.

For example, matching_scopes(r’^command’) will return all scopes whose names begin with command.

pop_scope() ConfigScope[source]

Remove the highest precedence scope and return it.

print_section(section: str, blame: bool = False, *, scope=None) None[source]

Print a configuration to stdout.

push_scope(scope: ConfigScope) None[source]

Add a higher precedence scope to the Configuration.

remove_scope(scope_name: str) ConfigScope | None[source]

Remove scope by name; has no effect when scope_name does not exist

scopes: Dict[str, ConfigScope]
set(path: str, value: Any, scope: str | None = None) None[source]

Convenience function for setting single values in config files.

Accepts the path syntax described in get().
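
For instance, a hedged sketch setting a single nested value with that path syntax (the scope name is an example):

spack.config.CONFIG.set("config:build_jobs", 4, scope="user")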

update_config(section: str, update_data: Dict, scope: str | None = None, force: bool = False) None[source]

Update the configuration file for a particular scope.

Overwrites contents of a section in a scope with update_data, then writes out the config file.

update_data should have the top-level section name stripped off (it will be re-added). Data itself can be a list, dict, or any other yaml-ish structure.

Configuration scopes that are still written in an old schema format will fail to update unless force is True.

Parameters:
  • section – section of the configuration to be updated

  • update_data – data to be used for the update

  • scope – scope to be updated

  • force – force the update

class spack.config.ImmutableConfigScope(name, path)[source]

Bases: ConfigScope

A configuration scope that cannot be written to.

This is used for ConfigScopes passed on the command line.

class spack.config.InternalConfigScope(name: str, data: Dict[str, Any] | None = None)[source]

Bases: ConfigScope

An internal configuration scope that is not persisted to a file.

This is for spack internal use so that command-line options and config file settings are accessed the same way, and Spack can easily override settings from files.

clear() None[source]

Empty cached config information.

get_section(section: str) Dict[str, Any] | None[source]

Just reads from an internal dictionary.

get_section_filename(section: str) str[source]

Returns the filename associated with a given section

spack.config.SCOPES_METAVAR = '{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT'

metavar to use for commands that accept scopes; this is shorter and more readable than listing all choices

spack.config.SECTION_SCHEMAS: Dict[str, Any] = {'bootstrap': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'bootstrap': {'properties': {'enable': {'type': 'boolean'}, 'root': {'type': 'string'}, 'sources': {'items': {'additionalProperties': False, 'properties': {'metadata': {'type': 'string'}, 'name': {'type': 'string'}}, 'required': ['name', 'metadata'], 'type': 'object'}, 'type': 'array'}, 'trusted': {'patternProperties': {'\\w[\\w-]*': {'type': 'boolean'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack bootstrap configuration file schema', 'type': 'object'}, 'cdash': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'cdash': {'additionalProperties': False, 'patternProperties': {'build-group': {'type': 'string'}, 'project': {'type': 'string'}, 'site': {'type': 'string'}, 'url': {'type': 'string'}}, 'required': ['build-group'], 'type': 'object'}}, 'title': 'Spack cdash configuration file schema', 'type': 'object'}, 'ci': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'ci': {'oneOf': [{'anyOf': [{'additionalProperties': False, 'properties': {'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'items': {'type': 'string'}, 'type': 'array'}, 'enable-artifacts-buildcache': {'type': 'boolean'}, 'pipeline-gen': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'match_behavior': {'default': 'first', 'enum': ['first', 'merge'], 'type': 'string'}, 'submapping': {'items': {'additionalProperties': False, 'properties': {'build-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'build-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'match': {'items': {'type': 'string'}, 'type': 'array'}}, 'required': ['match'], 'type': 'object'}, 'type': 'array'}}, 'required': ['submapping'], 'type': 'object'}, {'oneOf': [{'additionalProperties': False, 'properties': {'noop-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 
'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'noop-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'build-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'build-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'copy-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 
'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'copy-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'reindex-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'reindex-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'signing-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'signing-job-remove': 
{'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'cleanup-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'cleanup-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'any-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'any-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 
'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}]}]}, 'type': 'array'}, 'rebuild-index': {'type': 'boolean'}, 'target': {'default': 'gitlab', 'enum': ['gitlab'], 'type': 'string'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'items': {'type': 'string'}, 'type': 'array'}, 'pipeline-gen': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'match_behavior': {'default': 'first', 'enum': ['first', 'merge'], 'type': 'string'}, 'submapping': {'items': {'additionalProperties': False, 'properties': {'build-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'build-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'match': {'items': {'type': 'string'}, 'type': 'array'}}, 'required': ['match'], 'type': 'object'}, 'type': 'array'}}, 'required': ['submapping'], 'type': 'object'}, {'oneOf': [{'additionalProperties': False, 'properties': {'noop-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 
'noop-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'build-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'build-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'copy-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'copy-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': 
{'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'reindex-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'reindex-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'signing-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'signing-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 
'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'cleanup-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'cleanup-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}, {'additionalProperties': False, 'properties': {'any-job': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}, 'any-job-remove': {'additionalProperties': True, 'properties': {'after_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'before_script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'anyOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}]}]}, 'type': 'array'}, 'rebuild-index': {'type': 'boolean'}, 'target': {'default': 'gitlab', 'enum': ['gitlab'], 'type': 'string'}, 'temporary-storage-url-prefix': {'type': 
'string'}}, 'type': 'object'}]}, {'anyOf': [{'additionalProperties': False, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'bootstrap': {'items': {'anyOf': [{'type': 'string'}, {'additionalProperties': False, 'properties': {'compiler-agnostic': {'default': False, 'type': 'boolean'}, 'name': {'type': 'string'}}, 'required': ['name'], 'type': 'object'}]}, 'type': 'array'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'items': {'type': 'string'}, 'type': 'array'}, 'enable-artifacts-buildcache': {'type': 'boolean'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'mappings': {'items': {'additionalProperties': False, 'properties': {'match': {'items': {'type': 'string'}, 'type': 'array'}, 'remove-attributes': {'additionalProperties': False, 'properties': {'tags': {'items': {'type': 'string'}, 'type': 'array'}}, 'required': ['tags'], 'type': 'object'}, 'runner-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}}, 'required': ['match'], 'type': 'object'}, 'type': 'array'}, 'match_behavior': {'default': 'first', 'enum': ['first', 'merge'], 'type': 'string'}, 'rebuild-index': {'type': 'boolean'}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'service-job-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}, 'signing-job-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['mappings'], 'type': 'object'}, {'additionalProperties': False, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 
'type': 'array'}, 'bootstrap': {'items': {'anyOf': [{'type': 'string'}, {'additionalProperties': False, 'properties': {'compiler-agnostic': {'default': False, 'type': 'boolean'}, 'name': {'type': 'string'}}, 'required': ['name'], 'type': 'object'}]}, 'type': 'array'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'mappings': {'items': {'additionalProperties': False, 'properties': {'match': {'items': {'type': 'string'}, 'type': 'array'}, 'remove-attributes': {'additionalProperties': False, 'properties': {'tags': {'items': {'type': 'string'}, 'type': 'array'}}, 'required': ['tags'], 'type': 'object'}, 'runner-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}}, 'required': ['match'], 'type': 'object'}, 'type': 'array'}, 'match_behavior': {'default': 'first', 'enum': ['first', 'merge'], 'type': 'string'}, 'rebuild-index': {'type': 'boolean'}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'service-job-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}, 'signing-job-attributes': {'additionalProperties': True, 'properties': {'after_script': {'items': {'type': 'string'}, 'type': 'array'}, 'before_script': {'items': {'type': 'string'}, 'type': 'array'}, 'image': {'oneOf': [{'type': 'string'}, {'properties': {'entrypoint': {'items': {'type': 'string'}, 'type': 'array'}, 'name': {'type': 'string'}}, 'type': 'object'}]}, 'script': {'items': {'type': 'string'}, 'type': 'array'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['tags'], 'type': 'object'}, 'tags': {'items': {'type': 'string'}, 'type': 'array'}, 'temporary-storage-url-prefix': {'type': 'string'}, 'variables': {'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}, 'type': 'object'}}, 'required': ['mappings'], 'type': 'object'}]}]}}, 'title': 'Spack CI configuration file schema', 'type': 'object'}, 'compilers': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'compilers': {'items': {'additionalProperties': False, 'properties': {'compiler': {'additionalProperties': False, 'properties': {'alias': {'anyOf': [{'type': 'string'}, 
{'type': 'null'}]}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'extra_rpaths': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'flags': {'additionalProperties': False, 'properties': {'cflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cppflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxxflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldlibs': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}, 'type': 'object'}, 'implicit_rpaths': {'anyOf': [{'items': {'type': 'string'}, 'type': 'array'}, {'type': 'boolean'}]}, 'modules': {'anyOf': [{'type': 'string'}, {'type': 'null'}, {'type': 'array'}]}, 'operating_system': {'type': 'string'}, 'paths': {'additionalProperties': False, 'properties': {'cc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxx': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'f77': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}, 'required': ['cc', 'cxx', 'f77', 'fc'], 'type': 'object'}, 'spec': {'type': 'string'}, 'target': {'type': 'string'}}, 'required': ['paths', 'spec', 'modules', 'operating_system'], 'type': 'object'}}, 'type': 'object'}, 'type': 'array'}}, 'title': 'Spack compiler configuration file schema', 'type': 'object'}, 'concretizer': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'concretizer': {'additionalProperties': False, 'properties': {'duplicates': {'properties': {'strategy': {'enum': ['none', 'minimal', 'full'], 'type': 'string'}}, 'type': 'object'}, 'enable_node_namespace': {'type': 'boolean'}, 'reuse': {'oneOf': [{'type': 'boolean'}, {'enum': ['dependencies'], 'type': 'string'}]}, 'targets': {'properties': {'granularity': {'enum': ['generic', 'microarchitectures'], 'type': 'string'}, 'host_compatible': {'type': 'boolean'}}, 'type': 'object'}, 'unify': {'oneOf': [{'type': 'boolean'}, {'enum': ['when_possible'], 'type': 'string'}]}}, 'type': 'object'}}, 'title': 'Spack concretizer configuration file schema', 'type': 'object'}, 'config': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'config': {'default': {}, 'deprecatedProperties': {'error': False, 'message': 'config:terminal_title has been replaced by install_status and is ignored', 'properties': ['terminal_title']}, 'properties': {'additional_external_search_paths': {'items': {'type': 'string'}, 'type': 'array'}, 'aliases': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object'}, 'allow_sgid': {'type': 'boolean'}, 'binary_index_root': {'type': 'string'}, 'binary_index_ttl': {'minimum': 0, 'type': 'integer'}, 'build_jobs': {'minimum': 1, 'type': 'integer'}, 'build_language': {'type': 'string'}, 
'build_stage': {'oneOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'ccache': {'type': 'boolean'}, 'checksum': {'type': 'boolean'}, 'concretizer': {'enum': ['original', 'clingo'], 'type': 'string'}, 'connect_timeout': {'minimum': 0, 'type': 'integer'}, 'db_lock_timeout': {'minimum': 1, 'type': 'integer'}, 'debug': {'type': 'boolean'}, 'deprecated': {'type': 'boolean'}, 'develop_stage_link': {'type': 'string'}, 'dirty': {'type': 'boolean'}, 'environments_root': {'type': 'string'}, 'extensions': {'items': {'type': 'string'}, 'type': 'array'}, 'flags': {'properties': {'keep_werror': {'enum': ['all', 'specific', 'none'], 'type': 'string'}}, 'type': 'object'}, 'install_hash_length': {'minimum': 1, 'type': 'integer'}, 'install_missing_compilers': {'type': 'boolean'}, 'install_path_scheme': {'type': 'string'}, 'install_status': {'type': 'boolean'}, 'install_tree': {'anyOf': [{'properties': {'padded_length': {'oneOf': [{'minimum': 0, 'type': 'integer'}, {'type': 'boolean'}]}, 'projections': {'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}, 'type': 'object'}, 'root': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'license_dir': {'type': 'string'}, 'locks': {'type': 'boolean'}, 'misc_cache': {'type': 'string'}, 'package_lock_timeout': {'anyOf': [{'minimum': 1, 'type': 'integer'}, {'type': 'null'}]}, 'shared_linking': {'anyOf': [{'enum': ['rpath', 'runpath'], 'type': 'string'}, {'properties': {'bind': {'type': 'boolean'}, 'type': {'enum': ['rpath', 'runpath'], 'type': 'string'}}, 'type': 'object'}]}, 'source_cache': {'type': 'string'}, 'stage_name': {'type': 'string'}, 'suppress_gpg_warnings': {'type': 'boolean'}, 'template_dirs': {'items': {'type': 'string'}, 'type': 'array'}, 'test_stage': {'type': 'string'}, 'url_fetch_method': {'enum': ['urllib', 'curl'], 'type': 'string'}, 'verify_ssl': {'type': 'boolean'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}, 'definitions': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'definitions': {'default': [], 'items': {'patternProperties': {'^(?!when$)\\w*': {'default': [], 'items': {'anyOf': [{'additionalProperties': False, 'properties': {'exclude': {'items': {'type': 'string'}, 'type': 'array'}, 'matrix': {'items': {'items': {'type': 'string'}, 'type': 'array'}, 'type': 'array'}}, 'type': 'object'}, {'type': 'string'}, {'type': 'null'}]}, 'type': 'array'}}, 'properties': {'when': {'type': 'string'}}, 'type': 'object'}, 'type': 'array'}}, 'title': 'Spack definitions configuration file schema', 'type': 'object'}, 'develop': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'develop': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'properties': {'path': {'type': 'string'}, 'spec': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack repository configuration file schema', 'type': 'object'}, 'mirrors': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'mirrors': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'additionalProperties': False, 'anyOf': [{'required': ['url']}, {'required': ['fetch']}, {'required': ['pull']}], 'properties': {'access_pair': {'items': {'maxItems': 2, 'minItems': 2, 'type': ['string', 'null']}, 'type': 'array'}, 'access_token': 
{'type': ['string', 'null']}, 'binary': {'type': 'boolean'}, 'endpoint_url': {'type': ['string', 'null']}, 'fetch': {'anyOf': [{'type': 'string'}, {'additionalProperties': False, 'properties': {'access_pair': {'items': {'maxItems': 2, 'minItems': 2, 'type': ['string', 'null']}, 'type': 'array'}, 'access_token': {'type': ['string', 'null']}, 'endpoint_url': {'type': ['string', 'null']}, 'profile': {'type': ['string', 'null']}, 'url': {'type': 'string'}}, 'type': 'object'}]}, 'profile': {'type': ['string', 'null']}, 'push': {'anyOf': [{'type': 'string'}, {'additionalProperties': False, 'properties': {'access_pair': {'items': {'maxItems': 2, 'minItems': 2, 'type': ['string', 'null']}, 'type': 'array'}, 'access_token': {'type': ['string', 'null']}, 'endpoint_url': {'type': ['string', 'null']}, 'profile': {'type': ['string', 'null']}, 'url': {'type': 'string'}}, 'type': 'object'}]}, 'signed': {'type': 'boolean'}, 'source': {'type': 'boolean'}, 'url': {'type': 'string'}}, 'type': 'object'}]}}, 'type': 'object'}}, 'title': 'Spack mirror configuration file schema', 'type': 'object'}, 'modules': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'modules': {'additionalProperties': False, 'patternProperties': {'^(?!prefix_inspections$)\\w[\\w-]*$': {'additionalProperties': False, 'default': {}, 'properties': {'arch_folder': {'type': 'boolean'}, 'enable': {'default': [], 'items': {'enum': ['tcl', 'lmod'], 'type': 'string'}, 'type': 'array'}, 'lmod': {'allOf': [{'allOf': [{'properties': {'all': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}, 'defaults': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'exclude': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'exclude_implicits': {'default': False, 'type': 'boolean'}, 'hash_length': {'default': 7, 'minimum': 0, 'type': 'integer'}, 'hide_implicits': {'default': False, 'type': 'boolean'}, 'include': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'naming_scheme': {'type': 'string'}, 'projections': {'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}, 'type': 'object'}, 'verbose': {'default': False, 'type': 'boolean'}}}, 
{'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}, '^[\\^@%+~]': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}}, 'validate_spec': True}], 'default': {}, 'type': 'object'}, {'properties': {'core_compilers': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'core_specs': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'filter_hierarchy_specs': {'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'hierarchy': {'default': [], 'items': 
{'type': 'string'}, 'type': 'array'}}, 'type': 'object'}]}, 'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'roots': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}, 'tcl': {'allOf': [{'allOf': [{'properties': {'all': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}, 'defaults': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'exclude': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'exclude_implicits': {'default': False, 'type': 'boolean'}, 'hash_length': {'default': 7, 'minimum': 0, 'type': 'integer'}, 'hide_implicits': {'default': False, 'type': 'boolean'}, 'include': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'naming_scheme': {'type': 'string'}, 'projections': {'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}, 'type': 'object'}, 'verbose': {'default': False, 'type': 'boolean'}}}, {'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': 
{'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}, '^[\\^@%+~]': {'additionalProperties': False, 'default': {}, 'properties': {'autoload': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'conflict': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}, 'filter': {'additionalProperties': False, 'default': {}, 'properties': {'exclude_env_vars': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'load': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'prerequisites': {'enum': ['none', 'run', 'direct', 'all'], 'type': 'string'}, 'suffixes': {'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}, 'type': 'object', 'validate_spec': True}, 'template': {'type': 'string'}}, 'type': 'object'}}, 'validate_spec': True}], 'default': {}, 'type': 'object'}, {}]}, 'use_view': {'anyOf': [{'type': 'string'}, {'type': 'boolean'}]}}, 'type': 'object'}}, 'properties': {'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack module file configuration file schema', 'type': 'object'}, 'packages': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'packages': {'additionalProperties': False, 'default': {}, 'patternProperties': {'(?!^all$)(^\\w[\\w-]*)': {'additionalProperties': False, 'default': {}, 'deprecatedProperties': {'error': False, 'message': "setting 'compiler:', 'target:' or 'provider:' preferences in a package-specific section of packages.yaml is deprecated, and will be removed in v0.22.\n\n\tThese preferences will be ignored by Spack, and can be set only in the 'all' section of the same file. 
You can run:\n\n\t\t$ spack audit configs\n\n\tto get better diagnostics, including files:lines where the deprecated attributes are used.\n\n\tUse requirements to enforce conditions on specific packages: https://spack.readthedocs.io/en/latest/packages_yaml.html#package-requirements\n", 'properties': ['target', 'compiler', 'providers']}, 'properties': {'buildable': {'default': True, 'type': 'boolean'}, 'compiler': {}, 'conflict': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'message': {'type': 'string'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, 'externals': {'items': {'additionalProperties': True, 'properties': {'extra_attributes': {'additionalProperties': True, 'properties': {'environment': {'additionalProperties': False, 'default': {}, 'properties': {'append_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'prepend_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'remove_path': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'set': {'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'type': 'object'}, 'unset': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}}, 'type': 'object'}, 'modules': {'items': {'type': 'string'}, 'type': 'array'}, 'prefix': {'type': 'string'}, 'spec': {'type': 'string'}}, 'required': ['spec'], 'type': 'object'}, 'type': 'array'}, 'package_attributes': {'additionalProperties': False, 'patternProperties': {'\\w+': {}}, 'type': 'object'}, 'permissions': {'additionalProperties': False, 'properties': {'group': {'type': 'string'}, 'read': {'enum': ['user', 'group', 'world'], 'type': 'string'}, 'write': {'enum': ['user', 'group', 'world'], 'type': 'string'}}, 'type': 'object'}, 'prefer': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'message': {'type': 'string'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, 'providers': {}, 'require': {'oneOf': [{'items': {'oneOf': [{'additionalProperties': False, 'properties': {'any_of': {'items': {'type': 'string'}, 'type': 'array'}, 'message': {'type': 'string'}, 'one_of': {'items': {'type': 'string'}, 'type': 'array'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, {'type': 'string'}]}, 'target': {}, 'variants': {'oneOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'version': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}}, 'properties': {'all': {'additionalProperties': False, 'default': {}, 'deprecatedProperties': {'error': False, 'message': "setting version preferences in the 'all' section of packages.yaml is deprecated and will be removed in v0.22\n\n\tThese preferences will be ignored by Spack. 
You can set them only in package-specific sections of the same file.\n", 'properties': ['version']}, 'properties': {'buildable': {'default': True, 'type': 'boolean'}, 'compiler': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'conflict': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'message': {'type': 'string'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, 'package_attributes': {'additionalProperties': False, 'patternProperties': {'\\w+': {}}, 'type': 'object'}, 'permissions': {'additionalProperties': False, 'properties': {'group': {'type': 'string'}, 'read': {'enum': ['user', 'group', 'world'], 'type': 'string'}, 'write': {'enum': ['user', 'group', 'world'], 'type': 'string'}}, 'type': 'object'}, 'prefer': {'items': {'oneOf': [{'additionalProperties': False, 'properties': {'message': {'type': 'string'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, 'providers': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'require': {'oneOf': [{'items': {'oneOf': [{'additionalProperties': False, 'properties': {'any_of': {'items': {'type': 'string'}, 'type': 'array'}, 'message': {'type': 'string'}, 'one_of': {'items': {'type': 'string'}, 'type': 'array'}, 'spec': {'type': 'string'}, 'when': {'type': 'string'}}, 'type': 'object'}, {'type': 'string'}]}, 'type': 'array'}, {'type': 'string'}]}, 'target': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'variants': {'oneOf': [{'type': 'string'}, {'items': {'type': 'string'}, 'type': 'array'}]}, 'version': {}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack package configuration file schema', 'type': 'object'}, 'repos': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'repos': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'title': 'Spack repository configuration file schema', 'type': 'object'}, 'upstreams': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'upstreams': {'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'properties': {'install_tree': {'type': 'string'}, 'modules': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}, 'view': {'$schema': 'http://json-schema.org/draft-07/schema#', 'properties': {'view': {'anyOf': [{'type': 'boolean'}, {'type': 'string'}, {'patternProperties': {'\\w+': {'additionalProperties': False, 'properties': {'exclude': {'items': {'type': 'string'}, 'type': 'array'}, 'link': {'pattern': '(roots|all|run)', 'type': 'string'}, 'link_type': {'type': 'string'}, 'projections': {'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}, 'type': 'object'}, 'root': {'type': 'string'}, 'select': {'items': {'type': 'string'}, 'type': 'array'}}, 'required': ['root']}}, 'type': 'object'}]}}, 'title': 'Spack view configuration file schema'}}

Dict from section names -> schema for that section

class spack.config.SingleFileScope(name: str, path: str, schema: Dict[str, Any], yaml_path: List[str] | None = None)[source]

Bases: ConfigScope

This class represents a configuration scope in a single YAML file.

get_section(section: str) Dict[str, Any] | None[source]

Returns the data associated with a given section

get_section_filename(section) str[source]

Returns the filename associated with a given section

property is_platform_dependent: bool

Returns true if the scope name is platform specific

spack.config.YamlConfigDict

Type used for raw YAML configuration

alias of Dict[str, Any]

spack.config.add(fullpath: str, scope: str | None = None) None[source]

Add the given configuration to the specified config scope. Add accepts a path. If you want to add from a filename, use add_from_file
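
For illustration, a path-style addition might look like this (the option and scope names are only examples):

import spack.config

# set a single option using the colon-separated path syntax
spack.config.add("config:debug:true", scope="user")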

spack.config.add_default_platform_scope(platform: str) None[source]
spack.config.add_from_file(filename: str, scope: str | None = None) None[source]

Add updates to a config from a filename

spack.config.change_or_add(section_name: str, find_fn: Callable[[str], bool], update_fn: Callable[[str], None]) None[source]

Change or add a subsection of config, with additional logic to select a reasonable scope where the change is applied.

Search through config scopes starting with the highest priority: the first scope matching a criterion (determined by find_fn) is updated; if no such config exists, find the first config scope that defines any config for the named section; if no scopes define any related config, then update the highest-priority config scope.

spack.config.collect_urls(base_url: str) list[source]

Return a list of configuration URLs.

Parameters:

base_url – URL for a configuration (yaml) file or a directory containing yaml file(s)

Returns:

List of configuration file(s) or empty list if none

spack.config.config_paths_from_entry_points() List[Tuple[str, str]][source]

Load configuration paths from entry points

A python package can register entry point metadata so that Spack can find its configuration by adding the following to the project’s pyproject.toml:

[project.entry-points."spack.config"]
baz = "baz:get_spack_config_path"

The function get_spack_config_path returns the path to the package’s spack configuration scope
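
A hypothetical baz package could implement that entry point roughly as follows (the package layout and directory name are assumptions):

# baz/__init__.py
import os

def get_spack_config_path():
    # directory containing this package's Spack configuration scope
    return os.path.join(os.path.dirname(__file__), "spack_config")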

spack.config.create() Configuration[source]

Singleton Configuration instance.

This constructs one instance associated with this module and returns it. It is bundled inside a function so that configuration can be initialized lazily.

spack.config.default_modify_scope(section: str = 'config') str[source]

Return the config scope that commands should modify by default.

Commands that modify configuration by default modify the highest priority scope.

Parameters:

section (str) – Section for which to get the default scope. If this is not ‘compilers’, a general (non-platform) scope is used.

spack.config.ensure_latest_format_fn(section: str) Callable[[Dict[str, Any]], bool][source]

Return a function that takes as input a dictionary read from a configuration file and update it to the latest format.

The function returns True if there was any update, False otherwise.

Parameters:

section – section of the configuration e.g. “packages”, “config”, etc.
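
A minimal sketch of how the returned callable might be used (the empty dictionary is only a placeholder for data read from a config.yaml file):

import spack.config

update_fn = spack.config.ensure_latest_format_fn("config")
data = {}                  # placeholder for a dictionary read from config.yaml
changed = update_fn(data)  # True if the dictionary was rewritten in place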

spack.config.fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) str[source]

Retrieve configuration file(s) at the specified URL.

Parameters:
  • url – URL for a configuration (yaml) file or a directory containing yaml file(s)

  • dest_dir – destination directory

  • skip_existing – Skip files that already exist in dest_dir if True; otherwise, replace those files

Returns:

Path to the corresponding file if the URL is (or contains) a single configuration file and it is the only file in the destination directory, or the root (dest_dir) directory if multiple configuration files exist or are retrieved.

spack.config.get(path: str, default: Any | None = None, scope: str | None = None) Any[source]

Module-level wrapper for Configuration.get().
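
For example (the option names are taken from the config schema above; the default is arbitrary):

import spack.config

build_jobs = spack.config.get("config:build_jobs", default=4)
full_config_section = spack.config.get("config")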

spack.config.get_mark_from_yaml_data(obj)[source]

Try to get spack.util.spack_yaml mark from YAML data.

We try the object, and if that fails we try its first member (if it’s a container).

Returns:

mark if one is found, otherwise None.

spack.config.get_valid_type(path)[source]

Returns an instance of a type that will pass validation for path.

The instance is created by calling the constructor with no arguments. If multiple types will satisfy validation for data at the configuration path given, the priority order is list, dict, str, bool, int, float.

spack.config.matched_config(cfg_path: str) List[Tuple[str, Any]][source]
spack.config.merge_yaml(dest, source, prepend=False, append=False)[source]

Merges source into dest; entries in source take precedence over dest.

This routine may modify dest and should be assigned to dest, in case dest was None to begin with, e.g.:

dest = merge_yaml(dest, source)

In the result, elements from lists from source will appear before elements of lists from dest. Likewise, when iterating over keys or items in merged OrderedDict objects, keys from source will appear before keys from dest.

Config file authors can optionally end any attribute in a dict with :: instead of :, and the key will override that of the parent instead of merging.

Ending a key with +: extends the default prepend merge strategy to include string concatenation; ending it with -: changes the merge strategy to append, which also includes string concatenation.
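
A minimal sketch of the basic merge behavior (the option names are illustrative):

import spack.config

dest = {"config": {"build_jobs": 4, "ccache": False}}
source = {"config": {"ccache": True}}

# entries from source win on conflict; assign the result back to dest
dest = spack.config.merge_yaml(dest, source)
# dest is now {"config": {"build_jobs": 4, "ccache": True}}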

spack.config.override(path_or_scope: ConfigScope | str, value: Any | None = None) Generator[Singleton | Configuration, None, None][source]

Simple way to override config settings within a context.

Parameters:
  • path_or_scope (ConfigScope or str) – scope or single option to override

  • value (object or None) – value for the single option

Temporarily push a scope on the current configuration, then remove it after the context completes. If a single option is provided, create an internal config scope for it and push/pop that scope.
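
Typical usage as a context manager (the option name is illustrative):

import spack.config

with spack.config.override("config:debug", True):
    assert spack.config.get("config:debug") is True
# the previous value of config:debug is restored here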

spack.config.parse_spec_from_yaml_string(string: str) Spec[source]

Parse a spec from YAML and add file/line info to errors, if it’s available.

Parse a Spec from the supplied string, but also intercept any syntax errors and add file/line information for debugging using file/line annotations from the string.

Parameters:

string – a string representing a Spec from config YAML.

spack.config.process_config_path(path: str) List[str][source]

Process a path argument to config.set() that may contain overrides (‘::’ or trailing ‘:’)

Colons will be treated as static strings if inside of quotes, e.g. this:is:a:path:’value:with:colon’ will yield:

[this, is, a, path, value:with:colon]

The path may consist only of keys (e.g. for a get) or may end in a value. Keys are always strings: if a user encloses a key in quotes, the quotes should be removed. Values with quotes should be treated as strings, but without quotes, may be parsed as a different yaml object (e.g. ‘{}’ is a dict, but ‘”{}”’ is a string).

This function does not know whether the final element of the path is a key or value, so:

  • It must strip the quotes, in case it is a key (so we look for “key” and not ‘“key”’)

  • It must indicate somehow that the quotes were stripped, in case it is a value (so that we don’t process ‘”{}”’ as a YAML dict)

Therefore, all elements with quotes are stripped, and then also converted to syaml_str (if treating the final element as a value, the caller should not parse it in this case).
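
For example, a plain path without quotes or overrides splits into its keys:

import spack.config

spack.config.process_config_path("config:install_tree:root")
# -> ['config', 'install_tree', 'root']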

spack.config.raw_github_gitlab_url(url: str) str[source]

Transform a github URL to the raw form to avoid undesirable html.

Parameters:

url – url to be converted to raw form

Returns:

Raw github/gitlab url or the original url

spack.config.read_config_file(filename: str, schema: Dict[str, Any] | None = None) Dict[str, Any] | None[source]

Read a YAML configuration file.

User can provide a schema for validation. If no schema is provided, we will infer the schema from the top-level key.

spack.config.remove_yaml(dest, source)[source]

UnMerges source from dest; entries in source take precedence over dest.

This routine may modify dest and should be assigned to dest, in case dest was None to begin with, e.g.:

dest = remove_yaml(dest, source)

In the result, elements from lists from source will not appear as elements of lists from dest. Likewise, when iterating over keys or items in merged OrderedDict objects, keys from source will not appear as keys in dest.

Config file authors can optionally end any attribute in a dict with :: instead of :, and the key will remove the entire section from dest

spack.config.scopes() Dict[str, ConfigScope][source]

Convenience function to get list of configuration scopes.

spack.config.set(path: str, value: Any, scope: str | None = None) None[source]

Convenience function for setting single values in config files.

Accepts the path syntax described in get().
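
For example (the scope name is illustrative):

import spack.config

spack.config.set("config:build_jobs", 8, scope="user")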

spack.config.update_all(section_name: str, change_fn: Callable[[str], bool]) None[source]

Change a config section, which may have details duplicated across multiple scopes.

spack.config.use_configuration(*scopes_or_paths: ConfigScope | str) Generator[Configuration, None, None][source]

Use the configuration scopes passed as arguments within the context manager.

Parameters:

*scopes_or_paths – scope objects or paths to be used

Returns:

Configuration object associated with the scopes passed as arguments
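
A minimal sketch, assuming the directories hold valid configuration scopes (the paths are illustrative):

import spack.config

with spack.config.use_configuration("/path/to/site/config", "/path/to/user/config") as cfg:
    debug = cfg.get("config:debug", False)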

spack.config.validate(data: Dict[str, Any], schema: Dict[str, Any], filename: str | None = None) Dict[str, Any][source]

Validate data read in from a Spack YAML file.

Parameters:
  • data – data read from a Spack YAML file

  • schema – jsonschema to validate data

This leverages the line information (start_mark, end_mark) stored on Spack YAML structures.

spack.config.writable_scope_names() List[str][source]
spack.config.writable_scopes() List[ConfigScope][source]

Return list of writable scopes. Higher-priority scopes come first in the list.

spack.context module

This module provides classes used in user and build environments.

class spack.context.Context(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

Enum used to indicate the context in which an environment has to be setup: build, run or test.

BUILD = 1
RUN = 2
TEST = 3
classmethod from_string(s: str)[source]
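
A short sketch, assuming the accepted strings are the lowercase member names:

from spack.context import Context

Context.from_string("build")  # -> Context.BUILD
Context.from_string("test")   # -> Context.TEST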

spack.cray_manifest module

exception spack.cray_manifest.ManifestValidationError(msg, long_msg=None)[source]

Bases: SpackError

spack.cray_manifest.compiler_from_entry(entry: dict, manifest_path: str)[source]
spack.cray_manifest.default_path = '/opt/cray/pe/cpe-descriptive-manifest/'

Cray systems can store a Spack-compatible description of system packages here.

spack.cray_manifest.entries_to_specs(entries)[source]
spack.cray_manifest.read(path, apply_updates)[source]
spack.cray_manifest.spec_from_entry(entry)[source]
spack.cray_manifest.translated_compiler_name(manifest_compiler_name)[source]

When creating a Compiler object, Spack expects a name matching one of the classes in spack.compilers. Names in the Cray manifest may differ; for cases where we know the name refers to a compiler in Spack, this function translates it automatically.

This function will raise an error if there is no recorded translation and the name doesn’t match a known compiler name.

spack.database module

Spack’s installation tracking database.

The database serves two purposes:

  1. It implements a cache on top of a potentially very large Spack directory hierarchy, speeding up many operations that would otherwise require filesystem access.

  2. It will allow us to track external installations as well as lost packages and their dependencies.

Prior to the implementation of this store, a directory layout served as the authoritative database of packages in Spack. This module provides a cache and a sanity checking mechanism for what is in the filesystem.

exception spack.database.CorruptDatabaseError(message, long_message=None)[source]

Bases: SpackError

Raised when errors are found while reading the database.

spack.database.DEFAULT_INSTALL_RECORD_FIELDS = ('spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for')

Default list of fields written for each install record

spack.database.DEFAULT_LOCK_CFG: LockConfiguration = (True, 120, None)

Default configuration for database locks

class spack.database.Database(root: str, upstream_dbs: List[Database] | None = None, is_upstream: bool = False, lock_cfg: LockConfiguration = (True, 120, None))[source]

Bases: object

add(spec_like, *args, **kwargs)[source]
all_hashes()[source]

Return dag hash of every spec in the database.

db_for_spec_hash(hash_key)[source]
deprecate(spec_like, *args, **kwargs)[source]
deprecator(spec)[source]

Return the spec that the given spec is deprecated for, or None

get_by_hash(dag_hash, default=None, installed=<built-in function any>)[source]

Look up a spec by DAG hash, or by a DAG hash prefix.

Parameters:
  • dag_hash (str) – hash (or hash prefix) to look up

  • default (object or None) – default value to return if dag_hash is not in the DB (default: None)

  • installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)

installed defaults to any so that we can refer to any known hash. Note that query() and query_one() differ in that they only return installed specs by default.

Returns:

a list of specs matching the hash or hash prefix

Return type:

(list)

get_by_hash_local(dag_hash, default=None, installed=<built-in function any>)[source]

Look up a spec in this DB by DAG hash, or by a DAG hash prefix.

Parameters:
  • dag_hash (str) – hash (or hash prefix) to look up

  • default (object or None) – default value to return if dag_hash is not in the DB (default: None)

  • installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)

installed defaults to any so that we can refer to any known hash. Note that query() and query_one() differ in that they only return installed specs by default.

Returns:

a list of specs matching the hash or hash prefix

Return type:

(list)

get_record(spec_like, *args, **kwargs)[source]
installed_extensions_for(spec_like, *args, **kwargs)[source]
installed_relatives(spec_like, *args, **kwargs)[source]
is_occupied_install_prefix(path)[source]
mark(spec_like, *args, **kwargs)[source]
missing(spec)[source]
query(*args, **kwargs)[source]

Query the Spack database including all upstream databases.

Parameters:
  • query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec)

  • known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed

  • installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)

  • explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled-in as a dependency of a user requested spec it’s considered implicit.

  • start_date (datetime.datetime or None) – filters the query discarding specs that have been installed before start_date.

  • end_date (datetime.datetime or None) – filters the query discarding specs that have been installed after end_date.

  • hashes (Container) – list or set of hashes that we can use to restrict the search

  • in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.

Returns:

list of specs that match the query

query_by_spec_hash(hash_key: str, data: Dict[str, InstallRecord] | None = None) Tuple[bool, InstallRecord | None][source]

Get a spec for hash, and whether it’s installed upstream.

Returns:

(bool, optional InstallRecord): the bool tells us whether the spec is installed upstream. Its InstallRecord is also returned if it’s installed at all; otherwise None.

Return type:

(tuple)

query_local(*args, **kwargs)[source]

Query only the local Spack database.

This function doesn’t guarantee any sorting of the returned data for performance reasons, since comparing specs for __lt__ may be an expensive operation.

Parameters:
  • query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec)

  • known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed

  • installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)

  • explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled-in as a dependency of a user requested spec it’s considered implicit.

  • start_date (datetime.datetime or None) – filters the query discarding specs that have been installed before start_date.

  • end_date (datetime.datetime or None) – filters the query discarding specs that have been installed after end_date.

  • hashes (Container) – list or set of hashes that we can use to restrict the search

  • in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.

Returns:

list of specs that match the query

query_local_by_spec_hash(hash_key)[source]

Get a spec by hash in the local database

Returns:

InstallRecord when installed locally, otherwise None.

Return type:

(InstallRecord or None)

query_one(query_spec, known=<built-in function any>, installed=True)[source]

Query for exactly one spec that matches the query spec.

Raises an assertion error if more than one spec matches the query. Returns None if no installed package matches.

read_transaction()[source]

Get a read lock context manager for use in a with block.
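
A minimal sketch of reading from the database under a read lock (the root path and query spec are illustrative):

import spack.database

db = spack.database.Database("/path/to/install/tree")
with db.read_transaction():
    installed_zlibs = db.query("zlib")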

record_fields: Tuple[str, ...] = ('spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for')

Fields written for each install record

reindex(directory_layout)[source]

Build database index from scratch based on a directory layout.

Locks the DB if it isn’t locked already.

remove(spec_like, *args, **kwargs)[source]
specs_deprecated_by(spec)[source]

Return all specs deprecated in favor of the given spec

unused_specs(root_hashes: Container[str] | None = None, deptype: int | str | List[str] | Tuple[str, ...] = 3) List[Spec][source]

Return all specs that are currently installed but not needed by root specs.

By default, roots are all explicit specs in the database. If a set of root hashes are passed in, they are instead used as the roots.

Parameters:
  • root_hashes – optional list of roots to consider when evaluating needed installations.

  • deptype – if a spec is reachable from a root via these dependency types, it is considered needed. By default only link and run dependency types are considered.

update_explicit(spec, explicit)[source]

Update the spec’s explicit state in the database.

Parameters:
  • spec (spack.spec.Spec) – the spec whose install record is being updated

  • explicit (bool) – True if the package was requested explicitly by the user, False if it was pulled in as a dependency of an explicit package.

write_transaction()[source]

Get a write lock context manager for use in a with block.

class spack.database.FailureTracker(root_dir: str | Path, default_timeout: float | None)[source]

Bases: object

Tracks installation failures.

Prefix failure marking takes two forms: a byte range lock on the nth byte of a file, used to coordinate between concurrent parallel build processes; and a persistent file, named with the full hash and containing the spec, in a subdirectory of the database, to enable persistence across overlapping but separate related build processes.

The failure lock file lives alongside the install DB.

n is the sys.maxsize-bit prefix of the associated DAG hash to make the likelihood of collision very low with no cleanup required.

clear(spec: Spec, force: bool = False) None[source]

Removes any persistent and cached failure tracking for the spec.

see mark().

Parameters:
  • spec – the spec whose failure indicators are being removed

  • force – True if the failure information should be cleared when a failure lock exists for the file, or False if the failure should not be cleared (e.g., it may be associated with a concurrent build)

clear_all() None[source]

Force remove install failure tracking files.

dir

Ensure a persistent location for dealing with parallel installation failures (e.g., across near-concurrent processes).

has_failed(spec: Spec) bool[source]

Return True if the spec is marked as failed.

lock_taken(spec: Spec) bool[source]

Return True if another process has a failure lock on the spec.

mark(spec: Spec) Lock[source]

Marks a spec as failing to install.

Parameters:

spec – spec that failed to install

persistent_mark(spec: Spec) bool[source]

Determine if the spec has a persistent failure marking.

class spack.database.ForbiddenLock[source]

Bases: object

exception spack.database.ForbiddenLockError(message, long_message=None)[source]

Bases: SpackError

Raised when an upstream DB attempts to acquire a lock

class spack.database.InstallRecord(spec: Spec, path: str, installed: bool, ref_count: int = 0, explicit: bool = False, installation_time: float | None = None, deprecated_for: Spec | None = None, in_buildcache: bool = False, origin=None)[source]

Bases: object

A record represents one installation in the DB.

The record keeps track of the spec for the installation, its install path, AND whether or not it is installed. We need the installed flag in case a user either:

  1. blew away a directory, or

  2. used spack uninstall -f to get rid of it

If, in either case, the package was removed but others still depend on it, we still need to track its spec, so we don’t actually remove from the database until a spec has no installed dependents left.

Parameters:
  • spec – spec tracked by the install record

  • path – path where the spec has been installed

  • installed – whether or not the spec is currently installed

  • ref_count (int) – number of specs that depend on this one

  • explicit (bool or None) – whether or not this spec was explicitly installed, or pulled-in as a dependency of something else

  • installation_time (datetime.datetime or None) – time of the installation

classmethod from_dict(spec, dictionary)[source]
install_type_matches(installed)[source]
to_dict(include_fields=('spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for'))[source]
class spack.database.InstallStatus[source]

Bases: str

class spack.database.InstallStatuses[source]

Bases: object

DEPRECATED = 'deprecated'
INSTALLED = 'installed'
MISSING = 'missing'
classmethod canonicalize(query_arg)[source]
exception spack.database.InvalidDatabaseVersionError(database, expected, found)[source]

Bases: SpackError

Exception raised when the database metadata is newer than current Spack.

property database_version_message
class spack.database.LockConfiguration(enable: bool, database_timeout: int | None, package_timeout: int | None)[source]

Bases: NamedTuple

Data class to configure locks in Database objects

Parameters:
  • enable – whether to enable locks or not.

  • database_timeout – timeout for the database lock

  • package_timeout – timeout for the package lock

database_timeout: int | None

Alias for field number 1

enable: bool

Alias for field number 0

package_timeout: int | None

Alias for field number 2

exception spack.database.MissingDependenciesError(message, long_message=None)[source]

Bases: SpackError

Raised when DB cannot find records for dependencies

spack.database.NO_LOCK: LockConfiguration = (False, None, None)

Configure a database to avoid using locks

spack.database.NO_TIMEOUT: LockConfiguration = (True, None, None)

Configure the database to use locks without a timeout

exception spack.database.NoSuchSpecError(spec)[source]

Bases: KeyError

Raised when a spec is not found in the database.

exception spack.database.NonConcreteSpecAddError(message, long_message=None)[source]

Bases: SpackError

Raised when attempting to add non-concrete spec to DB.

class spack.database.SpecLocker(lock_path: str | Path, default_timeout: float | None)[source]

Bases: object

Manages acquiring and releasing read or write locks on concrete specs.

clear(spec: Spec) Tuple[bool, Lock | None][source]
clear_all(clear_fn: Callable[[Lock], Any] | None = None) None[source]
has_lock(spec: Spec) bool[source]

Returns True if the spec is already managed by this spec locker

lock(spec: Spec, timeout: float | None = None) Lock[source]

Returns a lock on a concrete spec.

The lock is a byte range lock on the nth byte of a file.

The lock file is self.lock_path.

n is the sys.maxsize-bit prefix of the DAG hash. This makes the likelihood of collision very low AND gives us readers-writer lock semantics with just a single lockfile, so no cleanup is required.

raw_lock(spec: Spec, timeout: float | None = None) Lock[source]

Returns a raw lock for a Spec, but doesn’t keep track of it.

write_lock(spec: Spec) Generator[SpecLocker, None, None][source]
exception spack.database.UpstreamDatabaseLockingError(message, long_message=None)[source]

Bases: SpackError

Raised when an operation would need to lock an upstream database

spack.database.failures_lock_path(root_dir: str | Path) Path[source]

Returns the path of the failures lock file, given the root directory.

Parameters:

root_dir – root directory containing the database directory

spack.database.lock_configuration(configuration)[source]

Return a LockConfiguration from a spack.config.Configuration object.

spack.database.prefix_lock_path(root_dir: str | Path) Path[source]

Returns the path of the prefix lock file, given the root directory.

Parameters:

root_dir – root directory containing the database directory

spack.database.reader(version: StandardVersion) Type[SpecfileReaderBase][source]

spack.dependency module

Data structures that represent Spack’s dependency relationships.

class spack.dependency.Dependency(pkg: PackageBase, spec: Spec, depflag: int = 5)[source]

Bases: object

Class representing metadata for a dependency on a package.

This class differs from spack.spec.DependencySpec because it represents metadata at the Package level. spack.spec.DependencySpec is a descriptor for an actual package configuration, while Dependency is a descriptor for a package’s dependency requirements.

A dependency is a requirement for a configuration of another package that satisfies a particular spec. The dependency can have types, which determine how that package configuration is required, e.g. whether it is required for building the package, whether it needs to be linked to, or whether it is needed at runtime so that Spack can call commands from it.

A package can also depend on another package with patches. This is for cases where the maintainers of one package also maintain special patches for their dependencies. If one package depends on another with patches, a special version of that dependency with patches applied will be built for use by the dependent package. The patches are included in the new version’s spec hash to differentiate it from unpatched versions of the same package, so that unpatched versions of the dependency package can coexist with the patched version.

merge(other: Dependency)[source]

Merge constraints, deptypes, and patches of other into self.

property name: str

Get the name of the dependency package.

spack.deptypes module

Data structures that represent Spack’s edge types.

spack.deptypes.ALL: int = 15

A flag with all dependency types set

spack.deptypes.ALL_FLAGS: Tuple[int, int, int, int] = (4, 1, 2, 8)

An iterator of all flag components

spack.deptypes.ALL_TYPES: Tuple[str, ...] = ('build', 'link', 'run', 'test')

The types of dependency relationships that Spack understands.

spack.deptypes.DEFAULT: int = 5

Default dependency type if none is specified

spack.deptypes.DEFAULT_TYPES: Tuple[str, ...] = ('build', 'link')

Default dependency type if none is specified

spack.deptypes.DepFlag

Type hint for the low-level dependency input (enum.Flag is too slow)

spack.deptypes.DepType

Individual dependency types

spack.deptypes.DepTypes

Type hint for the high-level dependency input

alias of Union[str, List[str], Tuple[str, …]]

spack.deptypes.NONE: int = 0

A flag with no dependency types set

spack.deptypes.canonicalize(deptype: str | List[str] | Tuple[str, ...]) int[source]

Convert deptype user input to a DepFlag, or raise ValueError.

Parameters:

deptype – string representing dependency type, or a list/tuple of such strings. Can also be the builtin function all or the string ‘all’, which results in a tuple of all dependency types known to Spack.
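
For example (the values 5 and 15 correspond to DEFAULT and ALL above):

import spack.deptypes as dt

dt.canonicalize("build")              # flag with only the build bit set
dt.canonicalize(("build", "link"))    # == dt.DEFAULT (5)
dt.canonicalize("all")                # == dt.ALL (15)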

spack.deptypes.flag_from_string(s: str) int[source]
spack.deptypes.flag_from_strings(deptype: Iterable[str]) int[source]

Transform an iterable of deptype strings into a flag.

spack.deptypes.flag_to_chars(depflag: int) str[source]

Create a string representing deptypes for many dependencies.

The string will be some subset of ‘blrt’, like ‘bl ‘, ‘b t’, or ‘ lr ‘ where each letter in ‘blrt’ stands for ‘build’, ‘link’, ‘run’, and ‘test’ (the dependency types).

For a single dependency, this just indicates that the dependency has the indicated deptypes. For a list of dependencies, this shows whether ANY dependency in the list has the deptypes (so the deptypes are merged).
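
For example, the default build/link flag renders with the ‘b’ and ‘l’ positions filled and the others blank:

import spack.deptypes as dt

dt.flag_to_chars(dt.DEFAULT)   # something like 'bl  ' (run and test positions blank)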

spack.deptypes.flag_to_string(x: int) str[source]
spack.deptypes.flag_to_tuple(x: int) Tuple[str, ...][source]

spack.directives module

This package contains directives that can be used within a package.

Directives are functions that can be called inside a package definition to modify the package, for example:

class OpenMpi(Package):
    depends_on("hwloc")
    provides("mpi")
    ...

provides and depends_on are spack directives.

The available directives are:

  • build_system

  • conflicts

  • depends_on

  • extends

  • patch

  • provides

  • resource

  • variant

  • version

  • requires

exception spack.directives.DirectiveError(message, long_message=None)[source]

Bases: SpackError

This is raised when something is wrong with a package directive.

class spack.directives.DirectiveMeta(name, bases, attr_dict)[source]

Bases: type

Flushes the directives that were temporarily stored in the staging area into the package.

static directive(dicts=None)[source]

Decorator for Spack directives.

Spack directives allow you to modify a package while it is being defined, e.g. to add version or dependency information. Directives are one of the key pieces of Spack’s package “language”, which is embedded in python.

Here’s an example directive:

@directive(dicts='versions')
def version(pkg, ...):
    ...

This directive allows you to write:

class Foo(Package):
    version(...)

The @directive decorator handles a couple things for you:

  1. Adds the class scope (pkg) as an initial parameter when called, like a class method would. This allows you to modify a package from within a directive, while the package is still being defined.

  2. It automatically adds a dictionary called “versions” to the package so that you can refer to pkg.versions.

The (dicts='versions') part ensures that ALL packages in Spack will have a versions attribute after they’re constructed, and that if no directive actually modified it, it will just be an empty dict.

This is just a modular way to add storage attributes to the Package class, and it’s how Spack gets information from the packages to the core.

static pop_default_args()[source]

Pop default arguments

static pop_from_context()[source]

Pop the last constraint from the context

static push_default_args(default_args)[source]

Push default arguments

static push_to_context(when_spec)[source]

Add a spec to the context constraints.

spack.directives.build_system(*values, **kwargs)[source]
spack.directives.conflicts(conflict_spec: Spec | str, when: Spec | str | bool | None = None, msg: str | None = None)[source]

Allows a package to define a conflict.

Currently, a “conflict” is a concretized configuration that is known to be non-valid. For example, a package that is known not to be buildable with intel compilers can declare:

conflicts('%intel')

To express the same constraint only when the ‘foo’ variant is activated:

conflicts('%intel', when='+foo')
Parameters:
  • conflict_spec (spack.spec.Spec) – constraint defining the known conflict

  • when (spack.spec.Spec) – optional constraint that triggers the conflict

  • msg (str) – optional user defined message

spack.directives.depends_on(spec: Spec | str, when: Spec | str | bool | None = None, type: Tuple[str, ...] | str = ('build', 'link'), patches: Callable[[PackageBase | Dependency], None] | str | List[Callable[[PackageBase | Dependency], None] | str] | None = None)[source]

Creates a dict of deps with specs defining when they apply.

Parameters:
  • spec – the package and constraints depended on

  • when – when the dependent satisfies this, it has the dependency represented by spec

  • type – str or tuple of legal Spack deptypes

  • patches – single result of patch() directive, a str to be passed to patch, or a list of these

This directive is to be used inside a Package definition to declare that the package requires other packages to be built first. @see The section “Dependency specs” in the Spack Packaging Guide.
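
For illustration, a hypothetical package (package and variant names are made up) might declare dependencies in its package.py like this:

class Foo(Package):
    depends_on("cmake@3.18:", type="build")        # build-only dependency
    depends_on("hdf5+mpi", when="+parallel")       # conditional on a variant
    depends_on("zlib", type=("build", "link"))     # the default dependency types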

spack.directives.extends(spec, when=None, type=('build', 'run'), patches=None)[source]

Same as depends_on, but also adds this package to the extendee list.

Keyword arguments can be passed to extends() so that extension packages can pass parameters to the extendee’s extension mechanism.
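
A sketch of a hypothetical Python extension package using extends():

class PyFoo(PythonPackage):
    extends("python")                          # registers python as the extendee
    depends_on("py-setuptools", type="build")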

spack.directives.license(license_identifier: str, checked_by: str | List[str] | None = None, when: str | bool | None = None)[source]

Add a new license directive, to specify the SPDX identifier the software is distributed under.

Parameters:
  • license_identifier – SPDX identifier specifying the license(s) the software is distributed under.

  • checked_by – string or list of strings indicating which github user checked the license (if any).

  • when – A spec specifying when the license applies.
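
For illustration (identifiers and versions are made up):

class Foo(Package):
    license("Apache-2.0", checked_by="some-github-user")
    license("MIT", when="@:1.0")   # earlier releases used a different license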

spack.directives.maintainers(*names: str)[source]

Add a new maintainer directive, to specify maintainers in a declarative way.

Parameters:

names – GitHub username for the maintainer

spack.directives.patch(url_or_filename: str, level: int = 1, when: Spec | str | bool | None = None, working_dir: str = '.', reverse: bool = False, sha256: str | None = None, archive_sha256: str | None = None) Callable[[PackageBase | Dependency], None][source]

Packages can declare patches to apply to source. You can optionally provide a when spec to indicate that a particular patch should only be applied when the package’s spec meets certain conditions (e.g. a particular version).

Parameters:
  • url_or_filename – url or relative filename of the patch

  • level – patch level (as in the patch shell command)

  • when – optional anonymous spec that specifies when to apply the patch

  • working_dir – dir to change to before applying

  • reverse – reverse the patch

  • sha256 – sha256 sum of the patch, used to verify the patch (only required for URL patches)

  • archive_sha256 – sha256 sum of the archive, if the patch is compressed (only required for compressed URL patches)
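
Illustrative patch declarations (file names, URL, and checksum are placeholders):

class Foo(Package):
    # patch file shipped alongside package.py in the package repository
    patch("fix-build.patch", when="@1.2")
    # URL patch: sha256 is required
    patch("https://example.com/no-tests.patch",
          sha256="<sha256 of the patch file>",
          level=0, working_dir="src")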

spack.directives.provides(*specs: Spec | str, when: Spec | str | bool | None = None)[source]

Allows packages to provide a virtual dependency.

If a package provides “mpi”, other packages can declare that they depend on “mpi”, and spack can use the providing package to satisfy the dependency.

Parameters:
  • *specs – virtual specs provided by this package

  • when – condition when this provides clause needs to be considered
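
For example, a hypothetical MPI implementation could declare which versions of the mpi virtual its releases satisfy:

class FooMpi(Package):
    provides("mpi@:3.0", when="@:1.9")   # older releases only support MPI 3.0
    provides("mpi@:3.1", when="@2.0:")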

spack.directives.requires(*requirement_specs: str, policy='one_of', when=None, msg=None)[source]

Allows a package to request a configuration to be present in all valid solutions.

For instance, a package that is known to compile only with GCC can declare:

requires('%gcc')

A package that requires Apple-Clang on Darwin can declare instead:

requires('%apple-clang', when='platform=darwin', msg='Apple Clang is required on Darwin')

Parameters:
  • requirement_specs – spec expressing the requirement

  • when – optional constraint that triggers the requirement. If None the requirement is applied unconditionally.

  • msg – optional user defined message

spack.directives.resource(**kwargs)[source]

Define an external resource to be fetched and staged when building the package. Based on the keywords present in the dictionary the appropriate FetchStrategy will be used for the resource. Resources are fetched and staged in their own folder inside the Spack stage area, and then moved into the stage area of the package that needs them. A sketch of a typical call follows the keyword list below.

List of recognized keywords:

  • ‘when’ : (optional) represents the condition upon which the resource is needed

  • ‘destination’ : (optional) path where to move the resource. This path must be relative to the main package stage area.

  • ‘placement’ : (optional) gives the possibility to fine tune how the resource is moved into the main package stage area.
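
A sketch of a typical call (the name, URL, and checksum are placeholders); fetch-related keywords such as url and sha256 select the FetchStrategy, while the keywords listed above control staging:

resource(
    name="extra-data",
    url="https://example.com/extra-data-1.0.tar.gz",
    sha256="<sha256 of the archive>",
    destination="extras",      # relative to the main package stage area
    when="+extras",
)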

spack.directives.variant(name: str, default: Any | None = None, description: str = '', values: Sequence | Callable[[Any], bool] | None = None, multi: bool | None = None, validator: Callable[[str, str, Tuple[Any, ...]], None] | None = None, when: str | bool | None = None, sticky: bool = False)[source]

Define a variant for the package.

The packager can specify a default value as well as a text description; an example follows the parameter list.

Parameters:
  • name – Name of the variant

  • default – Default value for the variant, if not specified otherwise the default will be False for a boolean variant and ‘nothing’ for a multi-valued variant

  • description – Description of the purpose of the variant

  • values – Either a tuple of strings containing the allowed values, or a callable accepting one value and returning True if it is valid

  • multi – If False only one value per spec is allowed for this variant

  • validator – Optional group validator to enforce additional logic. It receives the package name, the variant name and a tuple of values and should raise an instance of SpackError if the group doesn’t meet the additional constraints

  • when – Optional condition on which the variant applies

  • sticky – The variant should not be changed by the concretizer to find a valid concrete spec

Raises:

DirectiveError – If arguments passed to the directive are invalid
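
For illustration, a boolean variant and a single-valued variant in a hypothetical package:

class Foo(Package):
    variant("shared", default=True, description="Build shared libraries")
    variant(
        "build_type",
        default="Release",
        values=("Release", "Debug", "RelWithDebInfo"),
        multi=False,
        description="CMake build type",
    )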

spack.directives.version(ver: str | int, checksum: str | None = None, *, preferred: bool | None = None, deprecated: bool | None = None, no_cache: bool | None = None, url: str | None = None, extension: str | None = None, expand: bool | None = None, fetch_options: dict | None = None, md5: str | None = None, sha1: str | None = None, sha224: str | None = None, sha256: str | None = None, sha384: str | None = None, sha512: str | None = None, git: str | None = None, commit: str | None = None, tag: str | None = None, branch: str | None = None, get_full_repo: bool | None = None, submodules: Callable[[PackageBase], str | List[str] | bool] | bool | None = None, submodules_delete: bool | None = None, svn: str | None = None, hg: str | None = None, cvs: str | None = None, revision: str | None = None, date: str | None = None)[source]

Adds a version and, if appropriate, metadata for fetching its code.

The version directives are aggregated into a versions dictionary attribute with Version keys and metadata values, where the metadata is stored as a dictionary of kwargs.

The (keyword) arguments are turned into a valid fetch strategy for code packages later. See spack.fetch_strategy.for_package_version().
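
For illustration (URLs and checksum are placeholders), an archive-based release and a git-based development version:

class Foo(Package):
    url = "https://example.com/foo-1.2.3.tar.gz"

    version("1.2.3", sha256="<sha256 of the 1.2.3 tarball>")
    version("develop", git="https://example.com/foo.git", branch="main")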

spack.directory_layout module

class spack.directory_layout.DirectoryLayout(root, **kwargs)[source]

Bases: object

A directory layout is used to associate unique paths with specs. Different installations are going to want different layouts for their install, and they can use this to customize the nesting structure of spack installs. The default layout is:

  • <install root>/

    • <platform-os-target>/

      • <compiler>-<compiler version>/

        • <name>-<version>-<hash>

The hash here is a SHA-1 hash for the full DAG plus the build spec.

The installation directory projections can be modified with the projections argument.

all_deprecated_specs()[source]
all_specs()[source]
build_packages_path(spec)[source]
create_install_directory(spec)[source]
deprecated_file_path(deprecated_spec, deprecator_spec=None)[source]

Gets full path to spec file for deprecated spec

If the deprecator_spec is provided, use that. Otherwise, assume deprecated_spec is already deprecated and its prefix links to the prefix of its deprecator.

disable_upstream_check()[source]
ensure_installed(spec)[source]

Throws InconsistentInstallDirectoryError if: 1. spec prefix does not exist 2. spec prefix does not contain a spec file, or 3. We read a spec with the wrong DAG hash out of an existing install directory.

env_metadata_path(spec)[source]
property hidden_file_regexes
metadata_path(spec)[source]
path_for_spec(spec)[source]

Return absolute path from the root to a directory for the spec.

read_spec(path)[source]

Read the contents of a file and parse them as a spec

relative_path_for_spec(spec)[source]
remove_install_directory(spec, deprecated=False)[source]

Removes a prefix and any empty parent directories from the root. Raises RemoveFailedError if something goes wrong.

spec_file_path(spec)[source]

Gets full path to spec file

specs_by_hash()[source]
write_host_environment(spec)[source]

The host environment is a json file with os, kernel, and spack versioning. We use it in the case that an analysis later needs to easily access this information.

write_spec(spec, path)[source]

Write a spec out to a file.

exception spack.directory_layout.DirectoryLayoutError(message, long_msg=None)[source]

Bases: SpackError

Superclass for directory layout errors.

exception spack.directory_layout.ExtensionAlreadyInstalledError(spec, ext_spec)[source]

Bases: DirectoryLayoutError

Raised when an extension is added to a package that already has it.

exception spack.directory_layout.ExtensionConflictError(spec, ext_spec, conflict)[source]

Bases: DirectoryLayoutError

Raised when an extension is added to a package that already has it.

exception spack.directory_layout.InconsistentInstallDirectoryError(message, long_msg=None)[source]

Bases: DirectoryLayoutError

Raised when a package seems to be installed to the wrong place.

exception spack.directory_layout.InvalidDirectoryLayoutParametersError(message, long_msg=None)[source]

Bases: DirectoryLayoutError

Raised when invalid directory layout parameters are supplied

exception spack.directory_layout.InvalidExtensionSpecError(message, long_msg=None)[source]

Bases: DirectoryLayoutError

Raised when an extension file has a bad spec in it.

exception spack.directory_layout.RemoveFailedError(installed_spec, prefix, error)[source]

Bases: DirectoryLayoutError

Raised when a DirectoryLayout cannot remove an install prefix.

exception spack.directory_layout.SpecReadError(message, long_msg=None)[source]

Bases: DirectoryLayoutError

Raised when directory layout can’t read a spec.

spack.error module

exception spack.error.FetchError(message, long_message=None)[source]

Bases: SpackError

Superclass for fetch-related errors.

exception spack.error.NoHeadersError(message, long_message=None)[source]

Bases: SpackError

Raised when package headers are requested but cannot be found

exception spack.error.NoLibrariesError(message_or_name, prefix=None)[source]

Bases: SpackError

Raised when package libraries are requested but cannot be found

exception spack.error.SpackError(message, long_message=None)[source]

Bases: Exception

This is the superclass for all Spack errors. Subclasses can be found in the modules they have to do with.

die()[source]
property long_message
print_context()[source]

Print extended debug information about this exception.

This is usually printed when the top-level Spack error handler calls die(), but it can be called separately beforehand if a lower-level error handler needs to print error context and continue without raising the exception to the top level.

exception spack.error.SpecError(message, long_message=None)[source]

Bases: SpackError

Superclass for all errors that occur while constructing specs.

exception spack.error.UnsatisfiableSpecError(provided, required, constraint_type)[source]

Bases: SpecError

Raised when a spec conflicts with package constraints.

For original concretizer, provide the requirement that was violated when raising.

exception spack.error.UnsupportedPlatformError(message)[source]

Bases: SpackError

Raised by packages when a platform is not supported

spack.error.debug = 0

At what level we should write stack traces or short error messages; this is module-scoped because it needs to be set very early.

spack.extensions module

Service functions and classes to implement the hooks for Spack’s command extensions.

exception spack.extensions.CommandNotFoundError(cmd_name)[source]

Bases: SpackError

Exception class thrown when a requested command is not recognized as such.

exception spack.extensions.ExtensionNamingError(path)[source]

Bases: SpackError

Exception class thrown when a configured extension does not follow the expected naming convention.

spack.extensions.ensure_extension_loaded(extension, *, path)[source]
spack.extensions.extension_name(path)[source]

Returns the name of the extension in the path passed as argument.

Parameters:

path (str) – path where the extension resides

Returns:

The extension name.

Raises:

ExtensionNamingError – if path does not match the expected format for a Spack command extension.

spack.extensions.extension_paths_from_entry_points() List[str][source]

Load extensions from a Python package’s entry points.

A python package can register entry point metadata so that Spack can find its extensions by adding the following to the project’s pyproject.toml:

[project.entry-points."spack.extensions"]
baz = "baz:get_spack_extensions"

The function get_spack_extensions returns the paths to the package’s Spack extensions.
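
On the Python side, the referenced function could look like this (the baz package and its layout are hypothetical):

# baz/__init__.py
import pathlib


def get_spack_extensions():
    # return the directories that contain this package's Spack extensions
    return [str(pathlib.Path(__file__).parent / "spack-baz")]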

spack.extensions.get_command_paths()[source]

Return the list of paths where to search for command files.

spack.extensions.get_extension_paths()[source]

Return the list of canonicalized extension paths from config:extensions.

spack.extensions.get_module(cmd_name)[source]

Imports the extension module for a particular command name and returns it.

Parameters:

cmd_name (str) – name of the command for which to get a module (contains -, not _).

spack.extensions.get_template_dirs()[source]

Returns the list of directories where to search for templates in extensions.

spack.extensions.load_command_extension(command, path)[source]

Loads a command extension from the path passed as argument.

Parameters:
  • command (str) – name of the command (contains -, not _).

  • path (str) – base path of the command extension

Returns:

A valid module if found and loadable; None if not found. Module loading exceptions are passed through.

spack.extensions.load_extension(name: str) str[source]

Loads a single extension into the ‘spack.extensions’ package.

Parameters:

name – name of the extension

spack.extensions.path_for_extension(target_name: str, *, paths: List[str]) str[source]

Return the test root dir for a given extension.

Parameters:
  • target_name (str) – name of the extension to test

  • *paths – paths where the extensions reside

Returns:

Root directory where tests should reside or None

spack.fetch_strategy module

Fetch strategies are used to download source code into a staging area in order to build it. They need to define the following methods (a minimal sketch of a custom strategy follows this list):

  • fetch()

    This should attempt to download/check out source from somewhere.

  • check()

    Apply a checksum to the downloaded source code, e.g. for an archive. May not do anything if the fetch method was safe to begin with.

  • expand()

    Expand (e.g., an archive) downloaded file to source, with the standard stage source path as the destination directory.

  • reset()

Restore original state of downloaded code. Used by clean commands. This may just remove the expanded source and re-expand an archive, or it may run something like git reset --hard.

  • archive()

    Archive a source directory, e.g. for creating a mirror.
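
A minimal sketch of a custom strategy implementing this interface (purely illustrative; it fetches nothing and is registered with the fetcher decorator documented below):

from spack.fetch_strategy import FetchStrategy, fetcher


@fetcher
class NullFetchStrategy(FetchStrategy):
    """Hypothetical no-op strategy showing the required interface."""

    url_attr = "null"   # version() keyword that would select this strategy

    def fetch(self):
        return True     # pretend the source was downloaded

    def check(self):
        pass            # nothing to checksum

    def expand(self):
        pass            # nothing to unpack

    def reset(self):
        pass            # nothing to restore

    def archive(self, destination):
        pass            # nothing to add to a mirror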

class spack.fetch_strategy.BundleFetchStrategy(**kwargs)[source]

Bases: FetchStrategy

Fetch strategy associated with bundle, or no-code, packages.

Having a basic fetch strategy is a requirement for executing post-install hooks. Consequently, this class provides the API but does little more than log messages.

TODO: Remove this class by refactoring resource handling and the link between composite stages and composite fetch strategies (see #11981).

property cachable

Report False as there is no code to cache.

fetch()[source]

Simply report success – there is no code to fetch.

mirror_id()[source]

BundlePackages don’t have a mirror id.

source_id()[source]

BundlePackages don’t have a source id.

url_attr: str | None = ''

There is no associated URL keyword in version() for no-code packages but this property is required for some strategy-related functions (e.g., check_pkg_attributes).

class spack.fetch_strategy.CacheURLFetchStrategy(url=None, checksum=None, **kwargs)[source]

Bases: URLFetchStrategy

The resource associated with a cache URL may be out of date.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

exception spack.fetch_strategy.ChecksumError(message, long_message=None)[source]

Bases: FetchError

Raised when archive fails to checksum.

class spack.fetch_strategy.CvsFetchStrategy(**kwargs)[source]

Bases: VCSFetchStrategy

Fetch strategy that gets source code from a CVS repository.

Use like this in a package:

version('name',
        cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename')

Optionally, you can provide a branch and/or a date for the URL:

version('name',
        cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename',
        branch='branchname', date='date')

Repositories are checked out into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

property cvs
fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = ['branch', 'date']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr: str | None = 'cvs'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

exception spack.fetch_strategy.ExtrapolationError(message, long_message=None)[source]

Bases: FetchError

Raised when we can’t extrapolate a version for a package.

exception spack.fetch_strategy.FailedDownloadError(url, msg='')[source]

Bases: FetchError

Raised when a download fails.

class spack.fetch_strategy.FetchAndVerifyExpandedFile(url, archive_sha256: str, expanded_sha256: str)[source]

Bases: URLFetchStrategy

Fetch strategy that verifies the content digest during fetching, as well as after expanding it.

expand()[source]

Verify checksum after expanding the archive.

class spack.fetch_strategy.FetchStrategy(**kwargs)[source]

Bases: object

Superclass of all fetch strategies.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

check()[source]

Checksum the archive fetched by this FetchStrategy.

expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

classmethod matches(args)[source]

Predicate that matches fetch strategies to arguments of the version directive.

Parameters:

args – arguments of the version directive

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = []
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

set_package(package)[source]
source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr: str | None = None

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

exception spack.fetch_strategy.FetcherConflict(message, long_message=None)[source]

Bases: FetchError

Raised for packages with invalid fetch attributes.

class spack.fetch_strategy.FsCache(root)[source]

Bases: object

destroy()[source]
fetcher(target_path, digest, **kwargs)[source]
store(fetcher, relative_dest)[source]
class spack.fetch_strategy.GCSFetchStrategy(*args, **kwargs)[source]

Bases: URLFetchStrategy

FetchStrategy that pulls from a GCS bucket.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

url_attr: str | None = 'gs'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.GitFetchStrategy(**kwargs)[source]

Bases: VCSFetchStrategy

Fetch strategy that gets source code from a git repository. Use like this in a package:

version('name', git='https://github.com/project/repo.git')

Optionally, you can provide a branch, or commit to check out, e.g.:

version('1.1', git='https://github.com/project/repo.git', tag='v1.1')

You can use these three optional attributes in addition to git:

  • branch: Particular branch to build from (default is the

    repository’s default branch)

  • tag: Particular tag to check out

  • commit: Particular commit hash in the repo

Repositories are cloned into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

clone(dest=None, commit=None, branch=None, tag=None, bare=False)[source]

Clone a repository to a path.

This method handles cloning from git, but does not require a stage.

Parameters:
  • dest (str or None) – The path into which the code is cloned. If None, requires a stage and uses the stage’s source path.

  • commit (str or None) – A commit to fetch from the remote. Only one of commit, branch, and tag may be non-None.

  • branch (str or None) – A branch to fetch from the remote.

  • tag (str or None) – A tag to fetch from the remote.

  • bare (bool) – Execute a “bare” git clone (--bare option to git)

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

property git
property git_version
git_version_re = 'git version (\\S+)'
mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = ['tag', 'branch', 'commit', 'submodules', 'get_full_repo', 'submodules_delete']
protocol_supports_shallow_clone()[source]

Shallow clone operations (--depth #) are not supported by the basic HTTP protocol or by no-protocol file specifications. Use (e.g.) https:// or file:// instead.

reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr: str | None = 'git'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

static version_from_git(git_exe)[source]

Given a git executable, return the Version (this will fail if the output cannot be parsed into a valid Version).

class spack.fetch_strategy.GoFetchStrategy(**kwargs)[source]

Bases: VCSFetchStrategy

Fetch strategy that employs the go get infrastructure.

Use like this in a package:

version('name',
        go='github.com/monochromegane/the_platinum_searcher/...')

Go get does not natively support versions; they can be faked with git.

The fetched source will be moved to the standard stage source path directory during the expand step.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

property go
property go_version
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

url_attr: str | None = 'go'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.HgFetchStrategy(**kwargs)[source]

Bases: VCSFetchStrategy

Fetch strategy that gets source code from a Mercurial repository. Use like this in a package:

version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')

Optionally, you can provide a branch, or revision to check out, e.g.:

version('torus',
        hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')

You can use the optional ‘revision’ attribute to check out a branch, tag, or particular revision in hg. To prevent non-reproducible builds, using a moving target like a branch is discouraged.

  • revision: Particular revision, branch, or tag.

Repositories are cloned into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

property hg

Returns: Executable: the hg executable

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = ['revision']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr: str | None = 'hg'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

exception spack.fetch_strategy.InvalidArgsError(pkg=None, version=None, **args)[source]

Bases: FetchError

Raised when a version can’t be deduced from a set of arguments.

exception spack.fetch_strategy.NoArchiveFileError(message, long_message=None)[source]

Bases: FetchError

Raised when an archive file is expected but none exists.

exception spack.fetch_strategy.NoCacheError(message, long_message=None)[source]

Bases: FetchError

Raised when there is no cached archive for a package.

exception spack.fetch_strategy.NoDigestError(message, long_message=None)[source]

Bases: FetchError

Raised after attempt to checksum when URL has no digest.

exception spack.fetch_strategy.NoStageError(method)[source]

Bases: FetchError

Raised when fetch operations are called before set_stage().

class spack.fetch_strategy.OCIRegistryFetchStrategy(url=None, checksum=None, **kwargs)[source]

Bases: URLFetchStrategy

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

class spack.fetch_strategy.S3FetchStrategy(*args, **kwargs)[source]

Bases: URLFetchStrategy

FetchStrategy that pulls from an S3 bucket.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

url_attr: str | None = 's3'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.SvnFetchStrategy(**kwargs)[source]

Bases: VCSFetchStrategy

Fetch strategy that gets source code from a subversion repository.

Use like this in a package:

version('name', svn='http://www.example.com/svn/trunk')

Optionally, you can provide a revision for the URL:

version('name', svn='http://www.example.com/svn/trunk',
        revision='1641')

Repositories are checked out into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = ['revision']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

property svn
url_attr: str | None = 'svn'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.URLFetchStrategy(url=None, checksum=None, **kwargs)[source]

Bases: FetchStrategy

URLFetchStrategy pulls source code from a URL for an archive, checks the archive against a checksum, and decompresses the archive.

The destination for the resulting file(s) is the standard stage path.

archive(destination)[source]

Just moves this archive to the destination.

property archive_file

Path to the source archive within this stage directory.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns:

True if can cache, False otherwise.

Return type:

bool

property candidate_urls
check()[source]

Check the downloaded archive against a checksum digest. No-op if this stage checks code out of a repository.

property curl
expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns:

True on success, False on failure.

Return type:

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs: List[str] = ['sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', 'checksum']
reset()[source]

Removes the source path if it exists, then re-expands the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr: str | None = 'url'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.VCSFetchStrategy(**kwargs)[source]

Bases: FetchStrategy

Superclass for version control system fetch strategies.

Like all fetchers, VCS fetchers are identified by the attributes passed to the version directive. The optional_attrs for a VCS fetch strategy represent types of revisions, e.g. tags, branches, commits, etc.

The required attributes (git, svn, etc.) are used to specify the URL and to distinguish a VCS fetch strategy from a URL fetch strategy.

archive(destination, *, exclude: str | None = None)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

check()[source]

Checksum the archive fetched by this FetchStrategy.

expand()[source]

Expand the downloaded archive into the stage source path.

spack.fetch_strategy.all_strategies = [<class 'spack.fetch_strategy.BundleFetchStrategy'>, <class 'spack.fetch_strategy.URLFetchStrategy'>, <class 'spack.fetch_strategy.CacheURLFetchStrategy'>, <class 'spack.fetch_strategy.GoFetchStrategy'>, <class 'spack.fetch_strategy.GitFetchStrategy'>, <class 'spack.fetch_strategy.CvsFetchStrategy'>, <class 'spack.fetch_strategy.SvnFetchStrategy'>, <class 'spack.fetch_strategy.HgFetchStrategy'>, <class 'spack.fetch_strategy.S3FetchStrategy'>, <class 'spack.fetch_strategy.GCSFetchStrategy'>, <class 'spack.fetch_strategy.FetchAndVerifyExpandedFile'>]

List of all fetch strategies, created by FetchStrategy metaclass.

spack.fetch_strategy.check_pkg_attributes(pkg)[source]

Find ambiguous top-level fetch attributes in a package.

Currently this only ensures that two or more VCS fetch strategies are not specified at once.

spack.fetch_strategy.fetcher(cls)[source]

Decorator used to register fetch strategies.

spack.fetch_strategy.for_package_version(pkg, version=None)[source]

Determine a fetch strategy based on the arguments supplied to version() in the package description.

spack.fetch_strategy.from_kwargs(**kwargs)[source]

Construct an appropriate FetchStrategy from the given keyword arguments.

Parameters:

**kwargs – dictionary of keyword arguments, e.g. from a version() directive in a package.

Returns:

The fetch strategy that matches the args, based

on attribute names (e.g., git, hg, etc.)

Return type:

Callable

Raises:

spack.error.FetchError – If no fetch_strategy matches the args.
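
For illustration (the repository URL is a placeholder), kwargs mirroring a version() directive select a matching fetcher:

import spack.fetch_strategy as fs

fetcher = fs.from_kwargs(git="https://example.com/repo.git", tag="v1.0")
print(type(fetcher).__name__)   # expected: GitFetchStrategy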

spack.fetch_strategy.from_list_url(pkg)[source]

If a package provides a URL which lists URLs for resources by version, this can create a fetcher for a URL discovered for the specified package’s version.

spack.fetch_strategy.from_url(url)[source]

Given a URL, find an appropriate fetch strategy for it. Currently just gives you a URLFetchStrategy that uses curl.

TODO: make this return appropriate fetch strategies for other types of URLs.

spack.fetch_strategy.from_url_scheme(url, *args, **kwargs)[source]

Finds a suitable FetchStrategy by matching its url_attr with the scheme in the given url.

spack.fetch_strategy.stable_target(fetcher)[source]

Returns whether the fetcher target is expected to have a stable checksum. This is only true if the target is a preexisting archive file.

spack.fetch_strategy.verify_checksum(file, digest)[source]
spack.fetch_strategy.warn_content_type_mismatch(subject, content_type='HTML')[source]

spack.filesystem_view module

class spack.filesystem_view.FilesystemView(root, layout, **kwargs)[source]

Bases: object

Governs a filesystem view that is located at certain root-directory.

Packages are linked from their install directories into a common file hierarchy.

In distributed filesystems, loading each installed package separately can lead to slow-downs due to too many directories being traversed. This can be circumvented by loading all needed modules into a common directory structure.

add_specs(*specs, **kwargs)[source]

Add given specs to view.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of activate_standalone.

add_standalone(spec)[source]

Add (link) a standalone package into this view.

check_added(spec)[source]

Check if the given concrete spec is active in this view.

get_all_specs()[source]

Get all specs currently active in this view.

get_projection_for_spec(spec)[source]

Get the projection in this view for a spec.

get_spec(spec)[source]

Return the actual spec linked in this view (i.e. do not look it up in the database by name).

spec can be a name or a spec from which the name is extracted.

As there can only be a single version active for any spec the name is enough to identify the spec in the view.

If no spec is present, returns None.

print_status(*specs, **kwargs)[source]
Print a short summary about the given specs, detailing whether..
  • ..they are active in the view.

  • ..they are active but the activated version differs.

  • ..they are not active in the view.

Takes with_dependencies keyword argument so that the status of dependencies is printed as well.

remove_specs(*specs, **kwargs)[source]

Removes given specs from view.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.

Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of deactivate_standalone.

remove_standalone(spec)[source]

Remove (unlink) a standalone package from this view.

class spack.filesystem_view.YamlFilesystemView(root, layout, **kwargs)[source]

Bases: FilesystemView

Filesystem view to work with a yaml based directory layout.

add_specs(*specs, **kwargs)[source]

Add given specs to view.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of activate_standalone.

add_standalone(spec)[source]

Add (link) a standalone package into this view.

check_added(spec)[source]

Check if the given concrete spec is active in this view.

clean()[source]
get_all_specs()[source]

Get all specs currently active in this view.

get_conflicts(*specs)[source]

Return list of tuples (<spec>, <spec in view>) where the spec active in the view differs from the one to be activated.

get_path_meta_folder(spec)[source]

Get path to meta folder for either spec or spec name.

get_projection_for_spec(spec)[source]

Return the projection for a spec in this view.

Relies on the ordering of projections to avoid ambiguity.

get_spec(spec)[source]

Return the actual spec linked in this view (i.e. do not look it up in the database by name).

spec can be a name or a spec from which the name is extracted.

As there can only be a single version active for any spec the name is enough to identify the spec in the view.

If no spec is present, returns None.

merge(spec, ignore=None)[source]
print_conflict(spec_active, spec_specified, level='error')[source]

Singular print function for spec conflicts.

print_status(*specs, **kwargs)[source]
Print a short summary about the given specs, detailing whether..
  • ..they are active in the view.

  • ..they are active but the activated version differs.

  • ..they are not active in the view.

Takes with_dependencies keyword argument so that the status of dependencies is printed as well.

read_projections()[source]
remove_files(files)[source]
remove_specs(*specs, **kwargs)[source]

Removes given specs from view.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.

Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of deactivate_standalone.

remove_standalone(spec)[source]

Remove (unlink) a standalone package from this view.

unmerge(spec, ignore=None)[source]
write_projections()[source]

spack.graph module

Functions for graphing DAGs of dependencies.

This file contains code for graphing DAGs of software packages (i.e. Spack specs). There are two main functions you probably care about:

graph_ascii() will output a colored graph of a spec in ascii format, kind of like the graph git shows with “git log --graph”, e.g.:

o  mpileaks
|\
| |\
| o |  callpath
|/| |
| |\|
| |\ \
| | |\ \
| | | | o  adept-utils
| |_|_|/|
|/| | | |
o | | | |  mpi
 / / / /
| | o |  dyninst
| |/| |
|/|/| |
| | |/
| o |  libdwarf
|/ /
o |  libelf
 /
o  boost

graph_dot() will output a graph of a spec (or multiple specs) in dot format.
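
A short sketch of both entry points, assuming a package named zlib is available in the active repositories:

import spack.graph
import spack.spec

spec = spack.spec.Spec("zlib").concretized()   # graphs require a concrete spec
spack.graph.graph_ascii(spec)                  # ASCII DAG on stdout
spack.graph.graph_dot([spec])                  # DOT output on stdout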

class spack.graph.AsciiGraph[source]

Bases: object

write(spec, color=None, out=None)[source]

Write out an ascii graph of the provided spec.

Arguments: spec – spec to graph. This only handles one spec at a time.

Optional arguments:

out – file object to write out to (default is sys.stdout)

color – whether to write in color. Default is to autodetect

based on output file.

class spack.graph.DAGWithDependencyTypes[source]

Bases: DotGraphBuilder

DOT graph with link,run nodes grouped together and edges colored according to the dependency types.

edge_entry(edge)[source]

Return a tuple of (parent_id, child_id, edge_options)

node_entry(node)[source]

Return a tuple of (node_id, node_options)

visit(edge)[source]

Visit an edge and build up entries to render the graph

class spack.graph.DotGraphBuilder[source]

Bases: object

Visit edges of a graph and build DOT options for nodes and edges

context()[source]

Return the context to be used to render the DOT graph template

edge_entry(edge: DependencySpec) Tuple[str, str, str][source]

Return a tuple of (parent_id, child_id, edge_options)

node_entry(node: Spec) Tuple[str, str][source]

Return a tuple of (node_id, node_options)

render() str[source]

Return a string with the output in DOT format

visit(edge: DependencySpec)[source]

Visit an edge and build up entries to render the graph

class spack.graph.SimpleDAG[source]

Bases: DotGraphBuilder

Simple DOT graph, with nodes colored uniformly and edges without properties

edge_entry(edge)[source]

Return a tuple of (parent_id, child_id, edge_options)

node_entry(node)[source]

Return a tuple of (node_id, node_options)

class spack.graph.StaticDag[source]

Bases: DotGraphBuilder

DOT graph for possible dependencies

edge_entry(edge)[source]

Return a tuple of (parent_id, child_id, edge_options)

node_entry(node)[source]

Return a tuple of (node_id, node_options)

spack.graph.find(seq, predicate)[source]

Find index in seq for which predicate is True.

Searches the sequence and returns the index of the element for which the predicate evaluates to True. Returns -1 if the predicate does not evaluate to True for any element in seq.

spack.graph.graph_ascii(spec, node='o', out=None, debug=False, indent=0, color=None, depflag: int = 15)[source]
spack.graph.graph_dot(specs: List[Spec], builder: DotGraphBuilder | None = None, depflag: int = 15, out: TextIO | None = None)[source]

DOT graph of the concrete specs passed as input.

Parameters:
  • specs – specs to be represented

  • builder – builder to use to render the graph

  • depflag – dependency types to consider

  • out – optional output stream. If None sys.stdout is used

spack.graph.static_graph_dot(specs: List[Spec], depflag: int = 15, out: TextIO | None = None)[source]

Static DOT graph with edges to all possible dependencies.

Parameters:
  • specs – abstract specs to be represented

  • depflag – dependency types to consider

  • out – optional output stream. If None sys.stdout is used

spack.hash_types module

Definitions that control how Spack creates Spec hashes.

class spack.hash_types.SpecHashDescriptor(depflag: int, package_hash, name, override=None)[source]

Bases: object

This class defines how hashes are generated on Spec objects.

Spec hashes in Spack are generated from a serialized (e.g., with YAML) representation of the Spec graph. The representation may only include certain dependency types, and it may optionally include a canonicalized hash of the package.py for each node in the graph.

We currently use different hashes for different use cases.

property attr

Private attribute stored on spec

spack.hash_types.dag_hash = <spack.hash_types.SpecHashDescriptor object>

Spack’s deployment hash. Includes all inputs that can affect how a package is built.

spack.hash_types.package_hash = <spack.hash_types.SpecHashDescriptor object>

Package hash used as part of dag hash

spack.hash_types.process_hash = <spack.hash_types.SpecHashDescriptor object>

Hash descriptor used only to transfer a DAG, as is, across processes

spack.install_test module

class spack.install_test.PackageTest(pkg: Pb)[source]

Bases: object

The class that manages stand-alone (post-install) package tests.

add_failure(exception: Exception, msg: str)[source]

Add the failure details to the current list.

archive_install_test_log(dest_dir: str)[source]
property archived_install_test_log: str
property logger: nixlog | winlog | None

The current logger or, if none is set, a newly created one.

parts() int[source]

The total number of (checked) test parts.

phase_tests(builder: Builder, phase_name: str, method_names: List[str])[source]

Execute the builder’s package phase-time tests.

Parameters:
  • builder – builder for package being tested

  • phase_name – the name of the build-time phase (e.g., build, install)

  • method_names – phase-specific callback method names

print_log_path()[source]

Print the test log file path.

ran_tests() bool[source]

True if ran tests, False otherwise.

stand_alone_tests(kwargs)[source]

Run the package’s stand-alone tests.

Parameters:

kwargs (dict) – arguments to be used by the test process

status(name: str, status: TestStatus, msg: str | None = None)[source]

Track and print the test status for the test part name.

summarize()[source]

Collect test results summary lines for this spec.

test_logger(verbose: bool = False, externals: bool = False)[source]

Context manager for setting up the test logger

Parameters:
  • verbose – Display verbose output, including echoing to stdout, otherwise suppress it

  • externals – True for performing tests of external packages, False to skip them

write_tested_status()[source]

Write the overall status to the tested file.

If there are any test part failures, then the tests failed. If all test parts were skipped, then the tests were skipped. If any tests passed, then the tests passed; otherwise, no tests were executed.

exception spack.install_test.SkipTest[source]

Bases: Exception

Raised when a test (part) is being skipped.

exception spack.install_test.TestFailure(failures: List[Tuple[BaseException, str]])[source]

Bases: SpackError

Raised when package tests have failed for an installation.

spack.install_test.TestFailureType

Stand-alone test failure info type

alias of Tuple[BaseException, str]

class spack.install_test.TestStatus(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

Names of different stand-alone test states.

FAILED = 1
NO_TESTS = -1
PASSED = 2
SKIPPED = 0
lower()[source]
class spack.install_test.TestSuite(specs, alias=None)[source]

Bases: object

The class that manages specs for spack test run execution.

property content_hash

The hash used to uniquely identify the test suite.

property current_test_cache_dir

Path to the test stage directory where the current spec’s cached build-time files were automatically copied.

Returns:

path to the current spec’s staged, cached build-time files.

Return type:

str

Raises:

TestSuiteSpecError – If there is no spec being tested

property current_test_data_dir

Path to the test stage directory where the current spec’s custom package (data) files were automatically copied.

Returns:

path to the current spec’s staged, custom package (data) files

Return type:

str

Raises:

TestSuiteSpecError – If there is no spec being tested

ensure_stage()[source]

Ensure the test suite stage directory exists.

static from_dict(d)[source]

Instantiates a TestSuite based on a dictionary of specs and an optional alias:

specs: list of the test suite’s specs in dictionary form
alias: the test suite alias

Returns:

Instance created from the specs

Return type:

TestSuite

static from_file(filename)[source]

Instantiate a TestSuite using the specs and optional alias provided in the given file.

Parameters:

filename (str) – The path to the JSON file containing the test suite specs and optional alias.

Raises:

BaseException – sjson.SpackJSONError if problem parsing the file

log_file_for_spec(spec)[source]

The test log file path for the provided spec.

Parameters:

spec (spack.spec.Spec) – instance of the spec under test

Returns:

the path to the spec’s log file

Return type:

str

property name

The name (alias or, if none, hash) of the test suite.

property results_file

The path to the results summary file.

property stage

The root test suite stage directory.

Returns:

the spec’s test stage directory path

Return type:

str

test_dir_for_spec(spec)[source]

The path to the test stage directory for the provided spec.

Parameters:

spec (spack.spec.Spec) – instance of the spec under test

Returns:

the spec’s test stage directory path

Return type:

str

classmethod test_log_name(spec)[source]

The standard log filename for a spec.

Parameters:

spec (spack.spec.Spec) – instance of the spec under test

Returns:

the spec’s log filename

Return type:

str

classmethod test_pkg_id(spec)[source]

The standard install test package identifier.

Parameters:

spec – instance of the spec under test

Returns:

the install test package identifier

Return type:

str

test_status(spec: Spec, externals: bool) TestStatus | None[source]

Determine the overall test results status for the spec.

Parameters:
  • spec – instance of the spec under test

  • externals – True if externals are to be tested, else False

Returns:

the spec’s test status if available or None

tested_file_for_spec(spec)[source]

The test status file path for the spec.

Parameters:

spec (spack.spec.Spec) – instance of the spec under test

Returns:

the spec’s test status file path

Return type:

str

classmethod tested_file_name(spec)[source]

The standard test status filename for the spec.

Parameters:

spec (spack.spec.Spec) – instance of the spec under test

Returns:

the spec’s test status filename

Return type:

str

to_dict()[source]

Build a dictionary for the test suite.

Returns:

The dictionary contains entries for up to two keys:

specs: list of the test suite’s specs in dictionary form
alias: the alias, or name, given to the test suite if provided

Return type:

dict

write_reproducibility_data()[source]
write_test_result(spec, result)[source]

Write the spec’s test result to the test suite results file.

Parameters:
  • spec (spack.spec.Spec) – instance of the spec under test

  • result (str) – result from the spec’s test execution (e.g, PASSED)

exception spack.install_test.TestSuiteError(message, long_message=None)[source]

Bases: SpackError

Raised when there is an error with the test suite.

exception spack.install_test.TestSuiteFailure(num_failures)[source]

Bases: SpackError

Raised when one or more tests in a suite have failed.

exception spack.install_test.TestSuiteNameError(message, long_message=None)[source]

Bases: SpackError

Raised when there is an issue with the naming of the test suite.

exception spack.install_test.TestSuiteSpecError(message, long_message=None)[source]

Bases: SpackError

Raised when there is an issue associated with the spec being tested.

spack.install_test.cache_extra_test_sources(pkg: Pb, srcs: str | List[str])[source]

Copy relative source paths to the corresponding install test subdir

This routine is intended as an optional install test setup helper for grabbing source files/directories during the installation process and copying them to the installation test subdirectory for subsequent use during install testing.

Parameters:
  • pkg – package being tested

  • srcs – relative path for file(s) and/or subdirectory(ies) located in the staged source path that are to be copied to the corresponding location(s) under the install testing directory.

Raises:

spack.installer.InstallError – if any of the source paths are absolute or do not exist under the build stage
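
A minimal sketch of how a package recipe might call this helper after installation. The Example class and the copied paths are hypothetical; names like Package and run_after come from spack.package, which is auto-imported into package files:

from spack.package import *
from spack.install_test import cache_extra_test_sources


class Example(Package):
    @run_after("install")
    def cache_test_sources(self):
        # copy relative paths from the staged source tree into the install
        # test subdirectory so stand-alone tests can use them later
        cache_extra_test_sources(self, ["tests", "examples/foo.c"])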

spack.install_test.check_outputs(expected: list | set | str, actual: str)[source]

Ensure the expected outputs are contained in the actual outputs.

Parameters:
  • expected – expected raw output string(s)

  • actual – actual output string

Raises:

RuntimeError – the expected output is not found in the actual output

spack.install_test.copy_test_files(pkg: Pb, test_spec: Spec)[source]

Copy the spec’s cached and custom test files to the test stage directory.

Parameters:
  • pkg – package being tested

  • test_spec – spec being tested, where the spec may be virtual

Raises:

TestSuiteError – package must be part of an active test suite

spack.install_test.find_required_file(root: str, filename: str, expected: int = 1, recursive: bool = True) str | List[str][source]

Find the required file(s) under the root directory.

Parameters:
  • root – root directory for the search

  • filename – name of the file being located

  • expected – expected number of files to be found under the directory (default is 1)

  • recursive – True if subdirectories are to be recursively searched, else False (default is True)

Returns: the path(s), relative to root, to the required file(s)

Raises:

SkipTest – when the number of files detected does not match expected
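
A hedged sketch of a typical call, assuming pkg is a package instance under test and that exactly one Makefile was cached under its install test root:

from spack.install_test import find_required_file, install_test_root

# raises SkipTest if the number of matches differs from `expected`
makefile = find_required_file(install_test_root(pkg), "Makefile", expected=1, recursive=True)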

spack.install_test.get_all_test_suites()[source]

Retrieves all validly staged TestSuites

Returns:

a list of TestSuite objects, which may be empty if there are none

Return type:

list

spack.install_test.get_escaped_text_output(filename: str) List[str][source]

Retrieve and escape the expected text output from the file

Parameters:

filename – path to the file

Returns:

escaped text lines read from the file
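
These two helpers are often used together in stand-alone tests; a minimal sketch, assuming exe is an Executable for the program under test and expected_output.txt is a cached reference file:

from spack.install_test import check_outputs, get_escaped_text_output

expected = get_escaped_text_output("expected_output.txt")  # escaped reference lines
actual = exe(output=str, error=str)                        # capture the program's output
check_outputs(expected, actual)                            # raises RuntimeError on a mismatch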

spack.install_test.get_named_test_suites(name)[source]

Retrieves test suites with the provided name.

Returns:

a list of matching TestSuite instances, which may be empty if none

Return type:

list

Raises:

TestSuiteNameError – if no name is provided

spack.install_test.get_test_stage_dir()[source]

Retrieves the config:test_stage path to the configured test stage root directory

Returns:

absolute path to the configured test stage root or, if none, the default test stage path

Return type:

str

spack.install_test.get_test_suite(name: str) TestSuite | None[source]

Ensure there is only one matching test suite with the provided name.

Returns:

the matching test suite if exactly one matches, else None

Raises:

TestSuiteNameError – if there is more than one matching TestSuite

spack.install_test.install_test_root(pkg: Pb)[source]

The install test root directory.

Parameters:

pkg – package being tested

spack.install_test.overall_status(current_status: TestStatus, substatuses: List[TestStatus]) TestStatus[source]

Determine the overall status based on the current and associated sub status values.

Parameters:
  • current_status – current overall status, assumed to default to PASSED

  • substatuses – status of each test part or overall status of each test spec

Returns:

test status encompassing the main test and all subtests
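
For example (a hedged sketch, assuming TestStatus is the status enumeration used throughout this module), a single failed part downgrades an otherwise passing result:

from spack.install_test import TestStatus, overall_status

# expected to yield TestStatus.FAILED, since one sub status failed
status = overall_status(TestStatus.PASSED, [TestStatus.PASSED, TestStatus.FAILED])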

spack.install_test.print_message(logger: nixlog | winlog, msg: str, verbose: bool = False)[source]

Print the message to the log, optionally echoing.

Parameters:
  • logger – instance of the output logger (e.g. nixlog or winlog)

  • msg – message being output

  • verbose – True displays verbose output, False suppresses it (False is the default)

spack.install_test.process_test_parts(pkg: Pb, test_specs: List[Spec], verbose: bool = False)[source]

Process test parts associated with the package.

Parameters:
  • pkg – package being tested

  • test_specs – list of test specs

  • verbose – Display verbose output (suppressed by default)

Raises:

TestSuiteError – package must be part of an active test suite

spack.install_test.results_filename = 'results.txt'

Name of the test suite results (summary) file

spack.install_test.spack_install_test_log = 'install-time-test-log.txt'

Name of the Spack install phase-time test log file

spack.install_test.test_function_names(pkg: Pb | Type[Pb], add_virtuals: bool = False) List[str][source]

Grab the names of all non-empty test functions.

Parameters:
  • pkg – package or package class of interest

  • add_virtuals – True also adds test methods of virtuals provided by the package, False only returns test functions of the package

Returns:

names of non-empty test functions

Raises:

ValueError – occurs if pkg is not a package class

spack.install_test.test_functions(pkg: Pb | Type[Pb], add_virtuals: bool = False) List[Tuple[str, Callable]][source]

Grab all non-empty test functions.

Parameters:
  • pkg – package or package class of interest

  • add_virtuals – True also adds test methods of virtuals provided by the package, False only returns test functions of the package

Returns:

list of non-empty test functions’ (name, function)

Raises:

ValueError – occurs if pkg is not a package class

spack.install_test.test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = '.', verbose: bool = False)[source]
spack.install_test.test_process(pkg: Pb, kwargs)[source]
spack.install_test.test_suite_filename = 'test_suite.lock'

Name of the test suite’s (JSON) lock file

spack.install_test.virtuals(pkg)[source]

Return a list of unique virtuals for the package.

Parameters:

pkg – package of interest

Returns: names of unique virtual packages

spack.install_test.write_test_suite_file(suite)[source]

Write the test suite to its (JSON) lock file.

spack.install_test.write_test_summary(counts: Counter)[source]

Write summary of the totals for each relevant status category.

Parameters:

counts – counts of the occurrences of relevant test status types

spack.installer module

This module encapsulates package installation functionality.

The PackageInstaller coordinates concurrent builds of packages for the same Spack instance by leveraging the dependency DAG and file system locks. It also proceeds with the installation of non-dependent packages of failed dependencies in order to install as many dependencies of a package as possible.

Bottom-up traversal of the dependency DAG while prioritizing packages with no uninstalled dependencies allows multiple processes to perform concurrent builds of separate packages associated with a spec.

File system locks enable coordination such that no two processes attempt to build the same or a failed dependency package.

Failures to install dependency packages result in removal of their dependents’ build tasks from the current process. A failure file is also written (and locked) so that other processes can detect the failure and adjust their build tasks accordingly.

This module supports the coordination of local and distributed concurrent installations of packages in a Spack instance.

exception spack.installer.BadInstallPhase(pkg_name, phase)[source]

Bases: InstallError

Raised when an install phase option is not allowed for a package.

class spack.installer.BuildProcessInstaller(pkg: PackageBase, install_args: dict)[source]

Bases: object

This class implements the part of the installation that happens in the child process.

run() bool[source]

Main entry point from build_process to kick off install in child.

class spack.installer.BuildRequest(pkg: PackageBase, install_args: dict)[source]

Bases: object

Class for representing an installation request.

get_depflags(pkg: PackageBase) int[source]

Determine the required dependency types for the associated package.

Parameters:

pkg – explicit or implicit package being installed

Returns:

required dependency type(s) for the package

Return type:

tuple

has_dependency(dep_id) bool[source]

Returns True if the package id represents a known dependency of the requested package, False otherwise.

run_tests(pkg: PackageBase) bool[source]

Determine if the tests should be run for the provided package

Parameters:

pkg – explicit or implicit package being installed

Returns:

True if they should be run; False otherwise

Return type:

bool

property spec: Spec

The specification associated with the package.

traverse_dependencies(spec=None, visited=None) Iterator[Spec][source]

Yield any dependencies of the appropriate type(s)

class spack.installer.BuildTask(pkg: PackageBase, request: BuildRequest | None, compiler: bool, start: float, attempts: int, status: str, installed: Set[str])[source]

Bases: object

Class for representing the build task for a package.

add_dependent(pkg_id: str) None[source]

Ensure the dependent package id is in the task’s list so it will be properly updated when this package is installed.

Parameters:

pkg_id – package identifier of the dependent package

property cache_only: bool
property explicit: bool

The package was explicitly requested by the user.

flag_installed(installed: List[str]) None[source]

Ensure the dependency is not considered to still be uninstalled.

Parameters:

installed – the identifiers of packages that have been installed so far

property is_root: bool

The package was requested directly, but may or may not be explicit in an environment.

property key: Tuple[int, int]

The key is the tuple (# uninstalled dependencies, sequence).

next_attempt(installed) BuildTask[source]

Create a new, updated task for the next installation attempt.

property priority

The priority is based on the remaining uninstalled dependencies.

property use_cache: bool
exception spack.installer.ExternalPackageError(message, long_msg=None, pkg=None)[source]

Bases: InstallError

Raised by install() when a package is only for external use.

class spack.installer.InstallAction[source]

Bases: object

INSTALL = 1

Do a standard install

NONE = 0

Don’t perform an install

OVERWRITE = 2

Do an overwrite install

exception spack.installer.InstallError(message, long_msg=None, pkg=None)[source]

Bases: SpackError

Raised when something goes wrong during install or uninstall.

The error can be annotated with a pkg attribute to allow the caller to get the package for which the exception was raised.

exception spack.installer.InstallLockError(message, long_msg=None, pkg=None)[source]

Bases: InstallError

Raised during install when something goes wrong with package locking.

class spack.installer.InstallStatus(pkg_count: int)[source]

Bases: object

get_progress() str[source]
next_pkg(pkg: PackageBase)[source]
set_term_title(text: str)[source]
class spack.installer.OverwriteInstall(installer: PackageInstaller, database: Database, task: BuildTask, install_status: InstallStatus)[source]

Bases: object

install()[source]

Try to run the install task overwriting the package prefix. If this fails, try to recover the original install prefix. If that fails too, mark the spec as uninstalled. This function always raises the original install error if installation fails.

class spack.installer.PackageInstaller(installs: List[Tuple[PackageBase, dict]] = [])[source]

Bases: object

Class for managing the install process for a Spack instance based on a bottom-up DAG approach.

This installer can coordinate concurrent batch and interactive, local and distributed (on a shared file system) builds for the same Spack instance.

install() None[source]

Install the requested package(s) and/or associated dependencies.

spack.installer.STATUS_ADDED = 'queued'

Build status indicating task has been added.

spack.installer.STATUS_DEQUEUED = 'dequeued'

Build status indicating the task has been popped from the queue

spack.installer.STATUS_FAILED = 'failed'

Build status indicating the spec failed to install

spack.installer.STATUS_INSTALLED = 'installed'

Build status indicating the spec was successfully installed

spack.installer.STATUS_INSTALLING = 'installing'

Build status indicating the spec is being installed (possibly by another process)

spack.installer.STATUS_REMOVED = 'removed'

Build status indicating task has been removed (to maintain priority queue invariants).

class spack.installer.TermStatusLine(enabled: bool)[source]

Bases: object

This class is used in distributed builds to inform the user that other packages are being installed by another process.

add(pkg_id: str)[source]

Add a package to the waiting list, and if it is new, update the status line.

clear()[source]

Clear the status line.

exception spack.installer.UpstreamPackageError(message, long_msg=None, pkg=None)[source]

Bases: InstallError

Raised during install when something goes wrong with an upstream package.

spack.installer.archive_install_logs(pkg: PackageBase, phase_log_dir: str) None[source]

Copy install logs to their destination directory(ies).

Parameters:
  • pkg – the package that was built and installed

  • phase_log_dir – path to the archive directory

spack.installer.build_process(pkg: PackageBase, install_args: dict) bool[source]

Perform the installation/build of the package.

This runs in a separate child process, and has its own process and python module space set up by build_environment.start_build_process().

This essentially wraps an instance of BuildProcessInstaller so that we can more easily create one in a subprocess.

This function’s return value is returned to the parent process.

Parameters:
  • pkg – the package being installed.

  • install_args – arguments to do_install() from parent process.

spack.installer.combine_phase_logs(phase_log_files: List[str], log_path: str) None[source]

Read set or list of logs and combine them into one file.

Each phase will produce its own log, so this function concatenates all the separate phase log output files into the pkg.log_path. It is written generally to accept a list of files and a log path to combine them into.

Parameters:
  • phase_log_files – a list or iterator of logs to combine

  • log_path – the path to combine them to

spack.installer.dump_packages(spec: Spec, path: str) None[source]

Dump all package information for a spec and its dependencies.

This creates a package repository within path for every namespace in the spec DAG, and fills the repos with package files and patch files for every node in the DAG.

Parameters:
  • spec – the Spack spec whose package information is to be dumped

  • path – the path to the build packages directory

spack.installer.get_dependent_ids(spec: Spec) List[str][source]

Return a list of package ids for the spec’s dependents

Parameters:

spec – Concretized spec

Returns: list of package ids

spack.installer.install_msg(name: str, pid: int, install_status: InstallStatus) str[source]

Colorize the name/id of the package being installed

Parameters:
  • name – Name/id of the package being installed

  • pid – id of the installer process

Returns: the colorized installing message

spack.installer.log(pkg: PackageBase) None[source]

Copy provenance into the install directory on success

Parameters:

pkg – the package that was built and installed

spack.installer.package_id(pkg: PackageBase) str[source]

A “unique” package identifier for installation purposes

The identifier is used to track build tasks, locks, install, and failure statuses.

The identifier needs to distinguish between combinations of compilers and packages for combinatorial environments.

Parameters:

pkg – the package from which the identifier is derived

spack.installer.print_install_test_log(pkg: PackageBase) None[source]

Output the install test log file path, but only if there are test failures.

Parameters:

pkg – instance of the package under test

spack.main module

This is the implementation of the Spack command line executable.

In a normal Spack installation, this is invoked from the bin/spack script after the system path is set up.

spack.main.SHOW_BACKTRACE = False

Whether to print backtraces on error

class spack.main.SpackArgumentParser(prog=None, usage=None, description=None, epilog=None, parents=[], formatter_class=<class 'argparse.HelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='error', add_help=True, allow_abbrev=True, exit_on_error=True)[source]

Bases: ArgumentParser

add_command(cmd_name)[source]

Add one subcommand to this parser.

add_subparsers(**kwargs)[source]

Ensure that sensible defaults are propagated to subparsers

format_help(level='short')[source]
format_help_sections(level)[source]

Format help on sections for a particular verbosity level.

Parameters:

level (str) – ‘short’ or ‘long’ (more commands shown for long)

class spack.main.SpackCommand(command_name, subprocess=False)[source]

Bases: object

Callable object that invokes a spack command (for testing).

Example usage:

install = SpackCommand('install')
install('-v', 'mpich')

Use this to invoke Spack commands directly from Python and check their output.

exception spack.main.SpackCommandError[source]

Bases: Exception

Raised when SpackCommand execution fails.

class spack.main.SpackHelpFormatter(prog, indent_increment=2, max_help_position=24, width=None)[source]

Bases: RawTextHelpFormatter

add_arguments(actions)[source]
spack.main.add_all_commands(parser)[source]

Add all spack subcommands to the parser.

spack.main.allows_unknown_args(command)[source]

Implements really simple argument injection for unknown arguments.

Commands may add an optional argument called “unknown args” to indicate they can handle unknown args, and we’ll pass the unknown args in.

spack.main.finish_parse_and_run(parser, cmd_name, main_args, env_format_error)[source]

Finish parsing after we know the command to run.

spack.main.get_spack_commit()[source]

Get the Spack git commit sha.

Returns:

(str or None) the commit sha if available, otherwise None

spack.main.get_version()[source]

Get a descriptive version of this instance of Spack.

Outputs ‘<PEP440 version> (<git commit sha>)’.

The commit sha is only added when available.

spack.main.index_commands()[source]

create an index of commands by section for this help level

spack.main.intro_by_level = {'long': 'Complete list of spack commands:', 'short': 'These are common spack commands:'}

intro text for help at different levels

spack.main.levels = ['short', 'long']

help levels in order of detail (i.e., number of commands shown)

spack.main.main(argv=None)[source]

This is the entry point for the Spack command.

main() itself is just an error handler – it handles errors for everything in Spack that makes it to the top level.

The logic is all in _main().

Parameters:

argv (list or None) – command line arguments, NOT including the executable name. If None, parses from sys.argv.

spack.main.make_argument_parser(**kwargs)[source]

Create a basic argument parser without any subcommands added.

spack.main.options_by_level = {'long': 'all', 'short': ['h', 'k', 'V', 'color']}

control top-level spack options shown in basic vs. advanced help

spack.main.print_setup_info(*info)[source]

Print basic information needed by setup-env.[c]sh.

Parameters:

info (list) – list of things to print: comma-separated list of ‘csh’, ‘sh’, or ‘modules’

This is in main.py to make it fast; the setup scripts need to invoke spack in login scripts, and it needs to be quick.

spack.main.required_command_properties = ['level', 'section', 'description']

Properties that commands are required to set.

spack.main.resolve_alias(cmd_name: str, cmd: List[str]) Tuple[str, List[str]][source]

Resolves aliases in the given command.

Parameters:
  • cmd_name – command name.

  • cmd – command line arguments.

Returns:

new command name and arguments.

spack.main.restore_macos_dyld_vars()[source]

Spack mutates DYLD_* variables in spack load and spack env activate. Unlike Linux, macOS SIP clears these variables in new processes, meaning that os.environ[“DYLD_*”] in our Python process is not the same as the user’s shell. Therefore, we store the user’s DYLD_* variables in SPACK_DYLD_* and restore them here.

spack.main.section_descriptions = {'admin': 'administration', 'basic': 'query packages', 'build': 'build packages', 'config': 'configuration', 'developer': 'developer', 'environment': 'environment', 'extensions': 'extensions', 'help': 'more help', 'packaging': 'create packages', 'system': 'system'}

Longer text for each section, to show in help

spack.main.section_order = {'basic': ['list', 'info', 'find'], 'build': ['fetch', 'stage', 'patch', 'configure', 'build', 'restage', 'install', 'uninstall', 'clean'], 'packaging': ['create', 'edit']}

preferential command order for some sections (e.g., build pipeline is in execution order, not alphabetical)

spack.main.send_warning_to_tty(message, *args)[source]

Redirects messages to tty.warn.

spack.main.set_working_dir()[source]

Change the working directory to getcwd, or spack prefix if no cwd.

spack.main.setup_main_options(args)[source]

Configure spack globals based on the basic options.

spack.main.spack_working_dir = None

Recorded directory where spack command was originally invoked

spack.main.stat_names = {'calls': (((1, -1),), 'call count'), 'cumtime': (((3, -1),), 'cumulative time'), 'cumulative': (((3, -1),), 'cumulative time'), 'filename': (((4, 1),), 'file name'), 'line': (((5, 1),), 'line number'), 'module': (((4, 1),), 'file name'), 'name': (((6, 1),), 'function name'), 'ncalls': (((1, -1),), 'call count'), 'nfl': (((6, 1), (4, 1), (5, 1)), 'name/file/line'), 'pcalls': (((0, -1),), 'primitive call count'), 'stdname': (((7, 1),), 'standard name'), 'time': (((2, -1),), 'internal time'), 'tottime': (((2, -1),), 'internal time')}

names of profile statistics

spack.mirror module

This file contains code for creating spack mirror directories. A mirror is an organized hierarchy containing specially named archive files. This enables spack to know where to find files in a mirror if the main server for a particular package is down. Or, if the computer where spack is run is not connected to the internet, it allows spack to download packages directly from a mirror (e.g., on an intranet).

class spack.mirror.Mirror(data: str | dict, name: str | None = None)[source]

Bases: object

Represents a named location for storing source tarballs and binary packages.

Mirrors have a fetch_url that indicates where and how artifacts are fetched from them, and a push_url that indicates where and how artifacts are pushed to them. These two URLs are usually the same.

property binary
display(max_len=0)[source]
property fetch_url

Get the valid, canonicalized fetch URL

static from_json(stream, name=None)[source]
static from_local_path(path: str)[source]
static from_url(url: str)[source]

Create an anonymous mirror by URL. This method validates the URL.

static from_yaml(stream, name=None)[source]
get_access_pair(direction: str) List | None[source]
get_access_token(direction: str) str | None[source]
get_endpoint_url(direction: str) str | None[source]
get_profile(direction: str) str | None[source]
get_url(direction: str) str[source]
property name
property push_url

Get the valid, canonicalized push URL

property signed: bool
property source
to_dict()[source]
to_json(stream=None)[source]
to_yaml(stream=None)[source]
update(data: dict, direction: str | None = None) bool[source]

Modify the mirror with the given data. This takes care of expanding a trivial mirror definition given as a URL into a richer dict form if necessary.

Parameters:
  • data (dict) – The data to update the mirror with.

  • direction (str) – The direction to update the mirror in (fetch or push or None for top-level update)

Returns:

True if the mirror was updated, False otherwise.

Return type:

bool

class spack.mirror.MirrorCollection(mirrors=None, scope=None, binary: bool | None = None, source: bool | None = None)[source]

Bases: Mapping

A mapping of mirror names to mirrors.

display()[source]
static from_dict(d)[source]
static from_json(stream, name=None)[source]
static from_yaml(stream, name=None)[source]
lookup(name_or_url)[source]

Looks up and returns a Mirror.

If this MirrorCollection contains a named Mirror under the name [name_or_url], then that mirror is returned. Otherwise, [name_or_url] is assumed to be a mirror URL, and an anonymous mirror with the given URL is returned.

to_dict(recursive=False)[source]
to_json(stream=None)[source]
to_yaml(stream=None)[source]
exception spack.mirror.MirrorError(msg, long_msg=None)[source]

Bases: SpackError

Superclass of all mirror-creation related errors.

class spack.mirror.MirrorReference(cosmetic_path, global_path=None)[source]

Bases: object

A MirrorReference stores the relative paths where you can store a package/resource in a mirror directory.

The appropriate storage location is given by storage_path. The cosmetic_path property provides a reference that a human could generate themselves based on reading the details of the package.

A user can iterate over a MirrorReference object to get all the possible names that might be used to refer to the resource in a mirror; this includes names generated by previous naming schemes that are no longer reported by storage_path or cosmetic_path.

property storage_path
class spack.mirror.MirrorStats[source]

Bases: object

added(resource)[source]
already_existed(resource)[source]
error()[source]
next_spec(spec)[source]
stats()[source]
class spack.mirror.OCIImageLayout(digest: Digest)[source]

Bases: object

Follow the OCI Image Layout Specification to archive blobs

Paths are of the form blobs/<algorithm>/<digest>

spack.mirror.add(mirror: Mirror, scope=None)[source]

Add a named mirror in the given scope

spack.mirror.create(path, specs, skip_unstable_versions=False)[source]

Create a directory to be used as a spack mirror, and fill it with package archives.

Parameters:
  • path – Path to create a mirror directory hierarchy in.

  • specs – Any package versions matching these specs will be added to the mirror.

  • skip_unstable_versions – if true, this skips adding resources when they do not have a stable archive checksum (as determined by fetch_strategy.stable_target)

Return Value:

Returns a tuple of lists: (present, mirrored, error)

  • present: Package specs that were already present.

  • mirrored: Package specs that were successfully mirrored.

  • error: Package specs that failed to mirror due to some error.

spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)[source]

Add a single package object to a mirror.

The package object is only required to have an associated spec with a concrete version.

Parameters:
  • pkg_obj – package object (with an associated concrete spec) to add

  • mirror_cache – mirror cache where the package’s resources are added

  • mirror_stats – mirror statistics to update

Returns:

True if the spec was added successfully, False otherwise

spack.mirror.get_all_versions(specs)[source]

Given a set of initial specs, return a new set of specs that includes each version of each package in the original set.

Note that if any spec in the original set specifies properties other than version, this information will be omitted in the new set; for example, the new set of specs will not include variant settings.

spack.mirror.get_matching_versions(specs, num_versions=1)[source]

Get a spec for EACH known version matching any spec in the list. For concrete specs, this retrieves the concrete version and, if more than one version per spec is requested, retrieves the latest versions of the package.

spack.mirror.mirror_archive_paths(fetcher, per_package_ref, spec=None)[source]

Returns a MirrorReference object which keeps track of the relative storage path of the resource associated with the specified fetcher.

spack.mirror.mirror_cache_and_stats(path, skip_unstable_versions=False)[source]

Return both a mirror cache and a mirror stats, starting from the path where a mirror ought to be created.

Parameters:
  • path (str) – path to create a mirror directory hierarchy in.

  • skip_unstable_versions – if true, this skips adding resources when they do not have a stable archive checksum (as determined by fetch_strategy.stable_target)

spack.mirror.remove(name, scope)[source]

Remove the named mirror in the given scope

spack.mirror.require_mirror_name(mirror_name)[source]

Find a mirror by name and raise if it does not exist

spack.mirror.supported_url_schemes = ('file', 'http', 'https', 'sftp', 'ftp', 's3', 'gs', 'oci')

What schemes do we support

spack.mixins module

This module contains additional behavior that can be attached to any given package.

spack.mixins.filter_compiler_wrappers(*files, **kwargs)[source]

Substitutes any path referring to a Spack compiler wrapper with the path of the underlying compiler that has been used.

If this isn’t done, the files will have CC, CXX, F77, and FC set to Spack’s generic cc, c++, f77, and f90. We want them to be bound to whatever compiler they were built with.

Parameters:
  • *files – files to be filtered relative to the search root (which is, by default, the installation prefix)

  • **kwargs

    allowed keyword arguments

    after

    specifies after which phase the files should be filtered (defaults to ‘install’)

    relative_root

path relative to prefix where to start searching for the files to be filtered. If not set, the install prefix will be used as the search root. It is highly recommended to set this, as searching from the installation prefix may affect performance severely in some cases.

    ignore_absent, backup

    these two keyword arguments, if present, will be forwarded to filter_file (see its documentation for more information on their behavior)

    recursive

    this keyword argument, if present, will be forwarded to find (see its documentation for more information on the behavior)
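
A minimal sketch of how a package might use this directive; the Example class and the wrapper script names are illustrative, and names like AutotoolsPackage and filter_compiler_wrappers come from spack.package, which is auto-imported into package files:

from spack.package import *


class Example(AutotoolsPackage):
    # after the install phase, rewrite compiler-wrapper paths recorded in these
    # scripts (searched under <prefix>/bin) to the real underlying compilers
    filter_compiler_wrappers("mpicc", "mpicxx", "mpif77", "mpif90", relative_root="bin")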

spack.multimethod module

This module contains utilities for using multi-methods in spack. You can think of multi-methods like overloaded methods – they’re methods with the same name, and we need to select a version of the method based on some criteria. e.g., for overloaded methods, you would select a version of the method to call based on the types of its arguments.

In spack, multi-methods are used to ease the life of package authors. They allow methods like install() (or other methods called by install()) to declare multiple versions to be called when the package is instantiated with different specs. e.g., if the package is built with OpenMPI on x86_64, you might want to call a different install method than if it was built for mpich2 on BlueGene/Q. Likewise, you might want to do a different type of install for different versions of the package.

Multi-methods provide a simple decorator-based syntax for this that avoids overly complicated rat nests of if statements. Obviously, depending on the scenario, regular old conditionals might be clearer, so package authors should use their judgement.
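
A hedged sketch of that decorator-based syntax in a package recipe (Example is a hypothetical package; when and Package are available in package files via spack.package):

from spack.package import *


class Example(Package):
    def install(self, spec, prefix):
        # default version of install(), used when no @when condition matches
        pass

    @when("platform=darwin")
    def install(self, spec, prefix):
        # version of install() selected when the package is built on macOS
        pass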

exception spack.multimethod.MultiMethodError(message)[source]

Bases: SpackError

Superclass for multimethod dispatch errors

class spack.multimethod.MultiMethodMeta(name, bases, attr_dict)[source]

Bases: type

This allows us to track the class’s dict during instantiation.

exception spack.multimethod.NoSuchMethodError(cls, method_name, spec, possible_specs)[source]

Bases: SpackError

Raised when we can’t find a version of a multi-method.

class spack.multimethod.SpecMultiMethod(default=None)[source]

Bases: object

This implements a multi-method for Spack specs. Packages are instantiated with a particular spec, and you may want to execute different versions of methods based on what the spec looks like. For example, you might want to call a different version of install() for one platform than you call on another.

The SpecMultiMethod class implements a callable object that handles method dispatch. When it is called, it looks through registered methods and their associated specs, and it tries to find one that matches the package’s spec. If it finds one (and only one), it will call that method.

This is intended for use with decorators (see below). The decorator (see docs below) creates SpecMultiMethods and registers method versions with them.

To register a method, you can do something like this:

mm = SpecMultiMethod()
mm.register("^chaos_5_x86_64_ib", some_method)

The object registered needs to be a Spec or some string that will parse to be a valid spec.

When the mm is actually called, it selects a version of the method to call based on the sys_type of the object it is called on.

See the docs for decorators below for more details.

register(spec, method)[source]

Register a version of a method for a particular spec.

spack.multimethod.default_args(**kwargs)[source]
class spack.multimethod.when(condition)[source]

Bases: object

spack.package module

spack.package is a set of useful build tools and directives for packages.

Everything in this module is automatically imported into Spack package files.

spack.package_base module

This is where most of the action happens in Spack.

The spack package class structure is based strongly on Homebrew (http://brew.sh/), mainly because Homebrew makes it very easy to create packages.

exception spack.package_base.ActivationError(msg, long_msg=None)[source]

Bases: ExtensionError

Raised when there are problems activating an extension.

exception spack.package_base.DependencyConflictError(conflict)[source]

Bases: SpackError

Raised when the dependencies cannot be flattened as asked for.

class spack.package_base.DetectablePackageMeta(name, bases, attr_dict)[source]

Bases: type

Check if a package is detectable and add default implementations for the detection function.

TAG = 'detectable'
exception spack.package_base.ExtensionError(message, long_msg=None)[source]

Bases: PackageError

Superclass for all errors having to do with extension packages.

spack.package_base.FLAG_HANDLER_TYPE

Type of a flag handler callable: maps a flag name and an iterable of flags to a triple of (compiler-wrapper, environment, build-system) flag iterables.

alias of Callable[[str, Iterable[str]], Tuple[Optional[Iterable[str]], Optional[Iterable[str]], Optional[Iterable[str]]]]

exception spack.package_base.InvalidPackageOpError(message, long_msg=None)[source]

Bases: PackageError

Raised when someone tries to perform an invalid operation on a package.

exception spack.package_base.ManualDownloadRequiredError(message, long_msg=None)[source]

Bases: InvalidPackageOpError

Raised when attempting an invalid operation on a package that requires a manual download.

exception spack.package_base.NoURLError(cls)[source]

Bases: PackageError

Raised when someone tries to build a URL for a package with no URLs.

class spack.package_base.PackageBase(spec)[source]

Bases: WindowsRPath, PackageViewMixin

This is the superclass for all spack packages.

*The Package class*

At its core, a package consists of a set of software to be installed. A package may focus on a piece of software and its associated software dependencies or it may simply be a set, or bundle, of software. The former requires defining how to fetch, verify (via, e.g., sha256), build, and install that software and the packages it depends on, so that dependencies can be installed along with the package itself. The latter, sometimes referred to as a no-source package, requires only defining the packages to be built.

Packages are written in pure Python.

There are two main parts of a Spack package:

  1. The package class. Classes contain directives, which are special functions, that add metadata (versions, patches, dependencies, and other information) to packages (see directives.py). Directives provide the constraints that are used as input to the concretizer.

  2. Package instances. Once instantiated, a package is essentially a software installer. Spack calls methods like do_install() on the Package object, and it uses those to drive user-implemented methods like patch(), install(), and other build steps. To install software, an instantiated package needs a concrete spec, which guides the behavior of the various install methods.

Packages are imported from repos (see repo.py).

Package DSL

Look in lib/spack/docs or check https://spack.readthedocs.io for the full documentation of the package domain-specific language. That used to be partially documented here, but as it grew, the docs here became increasingly out of date.

Package Lifecycle

A package’s lifecycle over a run of Spack looks something like this:

p = Package()             # Done for you by spack

p.do_fetch()              # downloads tarball from a URL (or VCS)
p.do_stage()              # expands tarball in a temp directory
p.do_patch()              # applies patches to expanded source
p.do_install()            # calls package's install() function
p.do_uninstall()          # removes install directory

although packages that have no code have nothing to fetch, so they omit p.do_fetch().

There are also some other commands that clean the build area:

p.do_clean()              # removes the stage directory entirely
p.do_restage()            # removes the build directory and
                          # re-expands the archive.

The convention used here is that a do_* function is intended to be called internally by Spack commands (in spack.cmd). These aren’t for package writers to override, and doing so may break the functionality of the Package class.

Package creators have a lot of freedom, and they could technically override anything in this class. That is not usually required.

For most use cases, package creators typically just add attributes like homepage and, for a code-based package, url, or functions such as install(). There are many custom Package subclasses in the spack.build_systems package that make things even easier for specific build systems.

classmethod all_patches()[source]

Retrieve all patches associated with the package.

Retrieves patches on the package itself as well as patches on the dependencies of the package.

property all_urls: List[str]

A list of all URLs in a package.

Check both class-level and version-specific URLs.

Returns a list of URLs

all_urls_for_version(version: StandardVersion) List[str][source]

Return all URLs derived from version_urls(), url, urls, and list_url (if it contains a version) in a package in that order.

Parameters:

version – the version for which a URL is sought

archive_install_test_log()[source]

Archive the install-phase test log, if present.

classmethod build_system_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None][source]

flag_handler that passes flags to the build system arguments. Any package using build_system_flags must also implement flags_to_build_system_args, or derive from a class that implements it. Currently, AutotoolsPackage and CMakePackage implement it.
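
Besides the predefined handlers (inject_flags, env_flags, build_system_flags), a package may define its own handler following the same convention; a minimal sketch with a hypothetical Example package, where the returned triple is (wrapper-injected, environment, build-system) flags:

from spack.package import *


class Example(Package):
    def flag_handler(self, name, flags):
        # append an extra C flag and inject everything through the compiler wrappers
        if name == "cflags":
            flags.append("-fno-strict-aliasing")
        return (flags, None, None)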

property builder
cache_extra_test_sources(srcs)[source]

Copy relative source paths to the corresponding install test subdir

This method is intended as an optional install test setup helper for grabbing source files/directories during the installation process and copying them to the installation test subdirectory for subsequent use during install testing.

Parameters:

srcs (str or list) – relative path for files and/or subdirectories located in the staged source path that are to be copied to the corresponding location(s) under the install testing directory.

property cmake_prefix_paths
property compiler

Get the spack.compiler.Compiler object used to build this package

property configure_args_path

Return the configure args file path associated with staging.

conflicts: Dict[Spec, List[Tuple[Spec, str | None]]]
content_hash(content=None)[source]

Create a hash based on the artifacts and patches used to build this package.

This includes:
  • source artifacts (tarballs, repositories) used to build;

  • content hashes (sha256’s) of all patches applied by Spack; and

  • canonicalized contents of the package.py recipe used to build.

This hash is only included in Spack’s DAG hash for concrete specs, but if it happens to be called on a package with an abstract spec, only applicable (i.e., determinable) portions of the hash will be included.

dependencies: Dict[Spec, Dict[str, Dependency]]
classmethod dependencies_by_name(when: bool = False)[source]
classmethod dependencies_of_type(deptypes: int)[source]

Get names of dependencies that can possibly have these deptypes.

This analyzes the package and determines which dependencies can be a certain kind of dependency. Note that they may not always be this kind of dependency, since dependencies can be optional, so something may be a build dependency in one configuration and a run dependency in another.

classmethod dependency_names()[source]
do_clean()[source]

Removes the package’s build stage and source tarball.

do_deprecate(deprecator, link_fn)[source]

Deprecate this package in favor of deprecator spec

do_fetch(mirror_only=False)[source]

Creates a stage directory and downloads the tarball for this package. Working directory will be set to the stage directory.

do_install(**kwargs)[source]

Called by commands to install a package and or its dependencies.

Package implementations should override install() to describe their build process.

Parameters:
  • cache_only (bool) – Fail if binary package unavailable.

  • dirty (bool) – Don’t clean the build environment before installing.

  • explicit (bool) – True if package was explicitly installed, False if package was implicitly installed (as a dependency).

  • fail_fast (bool) – Fail if any dependency fails to install; otherwise, the default is to install as many dependencies as possible (i.e., best effort installation).

  • fake (bool) – Don’t really build; install fake stub files instead.

  • force (bool) – Install again, even if already installed.

  • install_deps (bool) – Install dependencies before installing this package

  • install_source (bool) – By default, source is not installed, but for debugging it might be useful to keep it around.

  • keep_prefix (bool) – Keep install prefix on failure. By default, destroys it.

  • keep_stage (bool) – By default, stage is destroyed only if there are no exceptions during build. Set to True to keep the stage even with exceptions.

  • restage (bool) – Force spack to restage the package source.

  • skip_patch (bool) – Skip patch stage of build if True.

  • stop_before (str) – stop execution before this installation phase (or None)

  • stop_at (str) – last installation phase to be executed (or None)

  • tests (bool or list or set) – False to run no tests, True to test all packages, or a list of package names to run tests for some

  • use_cache (bool) – Install from binary package, if available.

  • verbose (bool) – Display verbose build output (by default, suppresses it)

do_patch()[source]

Applies patches if they haven’t been applied already.

do_restage()[source]

Reverts expanded/checked out source to a pristine state.

do_stage(mirror_only=False)[source]

Unpacks and expands the fetched tarball.

do_test(dirty=False, externals=False)[source]
do_uninstall(force=False)[source]

Uninstall this package by spec.

property download_instr

Defines the default manual download instructions. Packages can override the property to provide more information.

Returns:

default manual download instructions

Return type:

(str)

classmethod env_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None][source]

flag_handler that adds all flags to canonical environment variables.

property env_mods_path

Return the build environment modifications file path associated with staging.

property env_path

Return the build environment file path associated with staging.

extendable = False

Most packages are NOT extendable. Set to True if you want extensions.

property extendee_spec

Spec of the extendee of this package, or None if it is not an extension

extends(spec)[source]

Returns True if this package extends the given spec.

If self.spec is concrete, this returns whether this package extends the given spec.

If self.spec is not concrete, this returns whether this package may extend the given spec.

fetch_options: Dict[str, Any] = {}

Set of additional options used when fetching package versions.

fetch_remote_versions(concurrency: int | None = None) Dict[StandardVersion, str][source]

Find remote versions of this package.

Uses list_url and any other URLs listed in the package file.

Returns:

a dictionary mapping versions to URLs

Return type:

dict

property fetcher
find_valid_url_for_version(version)[source]

Returns a URL from which the specified version of this package may be downloaded after testing whether the url is valid. Will try url, urls, and list_url before failing.

Parameters:

version (Version) – the version for which a URL is sought (see version.py)

property flag_handler: Callable[[str, Iterable[str]], Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]]
flags_to_build_system_args(flags)[source]
classmethod format_doc(**kwargs)[source]

Wrap doc string at 72 characters and format nicely

fullname = 'spack.package_base'
fullnames = ['spack.package_base']
global_license_dir = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/etc/spack/licenses'
property global_license_file

Returns the path where a global license file for this particular package should be stored.

has_code = True

Most Spack packages are used to install source or binary code, while those that have no code can be used to install a set of other Spack packages.

property home
homepage: str | None = None

Package homepage where users can find more information about the package

classmethod inject_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None][source]

flag_handler that injects all flags through the compiler wrapper.

property install_configure_args_path

Return the configure args file path on successful installation.

property install_env_path

Return the build environment file path on successful installation.

property install_log_path

Return the (compressed) build log file path on successful installation

property install_test_root

Return the install test root directory.

property installed
property installed_upstream
property is_extension
keep_werror: str | None = None
license_comment = '#'

String. Contains the symbol used by the license manager to denote a comment. Defaults to #.

license_files: List[str] = []

List of strings. These are files that the software searches for when looking for a license. All file paths must be relative to the installation directory. More complex packages like Intel may require multiple licenses for individual components. Defaults to the empty list.

license_required = False

Boolean. If set to True, this software requires a license. If set to False, all of the license_* attributes will be ignored. Defaults to False.

license_url = ''

String. A URL pointing to license setup instructions for the software. Defaults to the empty string.

license_vars: List[str] = []

List of strings. Environment variables that can be set to tell the software where to look for a license if it is not in the usual location. Defaults to the empty list.

list_depth = 0

Link depth to which list_url should be searched for new versions

list_url: str | None = None

Default list URL (place to find available versions)

property log_path

Return the build log file path associated with staging.

maintainers: List[str] = []

List of strings which contains GitHub usernames of package maintainers. Do not include @ here in order not to unnecessarily ping the users.

manual_download = False

Boolean. Set to True for packages that require a manual download. This is currently used by package sanity tests and to generate a more meaningful fetch failure error.

metadata_attrs = ['homepage', 'url', 'urls', 'list_url', 'extendable', 'parallel', 'make_jobs', 'maintainers', 'tags']

List of attributes to be excluded from a package’s hash.

property metadata_dir

Return the install metadata directory.

module = <module 'spack.package_base' from '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/lib/spack/spack/package_base.py'>
name = 'package_base'
namespace = 'spack'
nearest_url(version)[source]

Finds the URL with the “closest” version to version.

This uses the following precedence order:

  1. Find the next lowest or equal version with a URL.

  2. If no lower URL, return the next higher URL.

  3. If no higher URL, return None.

non_bindable_shared_objects: List[str] = []

List of shared objects that should be replaced with a different library at runtime. Typically includes stub libraries like libcuda.so. When linking against a library listed here, the dependent will only record its soname or filename, not its absolute path, so that the dynamic linker will search for it. Note: accepts both file names and directory names, for example ["libcuda.so", "stubs"] will ensure libcuda.so and all libraries in the stubs directory are not bound by path.

package_dir = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/lib/spack/spack'
parallel = True

By default we build in parallel. Subclasses can override this.

patches: Dict[Spec, List[Patch]]
property phase_log_files

Find sorted phase log files written to the staging directory

classmethod possible_dependencies(transitive: bool = True, expand_virtuals: bool = True, depflag: int = 15, visited: dict | None = None, missing: dict | None = None, virtuals: set | None = None) Dict[str, Set[str]][source]

Return dict of possible dependencies of this package.

Parameters:
  • transitive (bool or None) – return all transitive dependencies if True, only direct dependencies if False (default True).

  • expand_virtuals (bool or None) – expand virtual dependencies into all possible implementations (default True)

  • depflag – dependency types to consider

  • visited (dict or None) – dict of names of dependencies visited so far, mapped to their immediate dependencies’ names.

  • missing (dict or None) – dict to populate with packages and their missing dependencies.

  • virtuals (set) – if provided, populate with virtuals seen so far.

Returns:

dictionary mapping dependency names to their

immediate dependencies

Return type:

(dict)

Each item in the returned dictionary maps a (potentially transitive) dependency of this package to its possible immediate dependencies. If expand_virtuals is False, virtual package names will be inserted as keys mapped to empty sets of dependencies. Virtuals, if not expanded, are treated as though they have no immediate dependencies.

Missing dependencies by default are ignored, but if a missing dict is provided, it will be populated with package names mapped to any dependencies they have that are in no repositories. This is only populated if transitive is True.

Note: the returned dict includes the package itself.
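
A hedged sketch of inspecting the result, assuming the package class is obtained from the repository via spack.repo.PATH.get_pkg_class and that the hdf5 package exists in the active repositories:

import spack.repo

pkg_cls = spack.repo.PATH.get_pkg_class("hdf5")  # assumed repository accessor
deps = pkg_cls.possible_dependencies(transitive=True, expand_virtuals=False)
for name, immediate in sorted(deps.items()):
    print(name, "->", sorted(immediate))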

property prefix

Get the prefix into which this package should be installed.

provided: Dict[Spec, Set[Spec]]
provided_together: Dict[Spec, List[Set[str]]]
classmethod provided_virtual_names()[source]

Return sorted list of names of virtuals that can be provided by this package.

provides(vpkg_name)[source]

True if this package provides a virtual package with the specified name

remove_prefix()[source]

Removes the prefix for a package along with any empty parent directories

requirements: Dict[Spec, List[Tuple[Tuple[Spec, ...], str, str | None]]]
property rpath

Get the rpath this package links with, as a list of paths.

property rpath_args

Get the rpath args as a string, with -Wl,-rpath, for each element

run_after_callbacks = []
run_before_callbacks = []
run_test(exe, options=[], expected=[], status=0, installed=False, purpose=None, skip_missing=False, work_dir=None)[source]

Run the test and confirm the expected results are obtained

Log any failures and continue, they will be re-raised later

Parameters:
  • exe (str) – the name of the executable

  • options (str or list) – list of options to pass to the runner

  • expected (str or list) – list of expected output strings. Each string is a regex expected to match part of the output.

  • status (int or list) – possible passing status values with 0 meaning the test is expected to succeed

  • installed (bool) – if True, the executable must be in the install prefix

  • purpose (str) – message to display before running test

  • skip_missing (bool) – skip the test if the executable is not in the install prefix bin directory or the provided work_dir

  • work_dir (str or None) – path to the smoke test directory

run_tests = False

By default do not run tests within package’s install()

sanity_check_is_dir: List[str] = []

List of prefix-relative directory paths (or a single path). If these do not exist after install, or if they exist but are not directories, sanity checks will fail.

sanity_check_is_file: List[str] = []

List of prefix-relative file paths (or a single path). If these do not exist after install, or if they exist but are not files, sanity checks fail.

setup_dependent_package(module, dependent_spec)[source]

Set up Python module-scope variables for dependent packages.

Called before the install() method of dependents.

Default implementation does nothing, but this can be overridden by an extendable package to set up the module of its extensions. This is useful if there are some common steps to installing all extensions for a certain package.

Examples:

  1. Extensions often need to invoke the python interpreter from the Python installation being extended. This routine can put a python() Executable object in the module scope for the extension package to simplify extension installs.

  2. MPI compilers could set some variables in the dependent’s scope that point to mpicc, mpicxx, etc., allowing them to be called by common name regardless of which MPI is used.

  3. BLAS/LAPACK implementations can set some variables indicating the path to their libraries, since these paths differ by BLAS/LAPACK implementation.

Parameters:
  • module (spack.package_base.PackageBase.module) – The Python module object of the dependent package. Packages can use this to set module-scope variables for the dependent to use.

  • dependent_spec (spack.spec.Spec) – The spec of the dependent package about to be built. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as self.spec.
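
A minimal sketch of the first example above, for a hypothetical Python-like extendable package (Package and Executable come from spack.package, which is auto-imported into package files):

from spack.package import *


class Example(Package):
    extendable = True

    def setup_dependent_package(self, module, dependent_spec):
        # give extension packages a ready-to-use python() Executable
        module.python = Executable(self.prefix.bin.python)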

setup_dependent_run_environment(env, dependent_spec)[source]

Sets up the run environment of packages that depend on this one.

This is similar to setup_run_environment, but it is used to modify the run environments of packages that depend on this one.

This gives packages like Python and others that follow the extension model a way to implement common environment or run-time settings for dependencies.

Parameters:
  • env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the dependent package is run. Package authors can call methods on it to alter the build environment.

  • dependent_spec (spack.spec.Spec) – The spec of the dependent package about to be run. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as self.spec

setup_run_environment(env)[source]

Sets up the run environment for a package.

Parameters:

env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the package is run. Package authors can call methods on it to alter the run environment.
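
A minimal sketch for a hypothetical package whose executables and libraries should be visible when the package is loaded:

from spack.package import *


class Example(Package):
    def setup_run_environment(self, env):
        # prepend this package's bin and lib directories to the user's environment
        env.prepend_path("PATH", self.prefix.bin)
        env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)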

property stage

Get the build staging area for this package.

This automatically instantiates a Stage object if the package doesn’t have one yet, but it does not create the Stage directory on the filesystem.

test()[source]
test_requires_compiler: bool = False

Set to True to indicate the stand-alone test requires a compiler. It is used to ensure a compiler and build dependencies like ‘cmake’ are available to build a custom test code.

test_suite: TestSuite | None = None

TestSuite instance used to manage stand-alone tests for 1+ specs.

property tester
property times_log_path

Return the times log json file.

transitive_rpaths = True

When True, add RPATHs for the entire DAG. When False, add RPATHs only for immediate dependencies.

static uninstall_by_spec(spec, force=False, deprecator=None)[source]
unit_test_check()[source]

Hook for unit tests to assert things about package internals.

Unit tests can override this function to perform checks after Package.install and all post-install hooks run, but before the database is updated.

The overridden function may indicate that the install procedure should terminate early (before updating the database) by returning False (or any value such that bool(result) is False).

Returns:

True to continue, False to skip install()

Return type:

(bool)
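A hedged sketch of how a unit test's package might override this hook; the package name and the file it checks are hypothetical:

import os

from spack.package import *


class MyPackage(Package):
    """Hypothetical package used to illustrate unit_test_check."""

    def unit_test_check(self):
        # A falsy return value stops the install before the database update.
        return os.path.exists(os.path.join(self.prefix.bin, "mytool"))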

update_external_dependencies(extendee_spec=None)[source]

Method to override in package classes to handle external dependencies

url_for_version(version)[source]

Returns a URL from which the specified version of this package may be downloaded.

Parameters:

version (Version) – The version for which a URL is sought. See class Version (version.py).

url_version(version)[source]

Given a version, this returns a string that should be substituted into the package’s URL to download that version.

By default, this just returns the version string. Subclasses may need to override this, e.g. for boost versions where you need to ensure that there are _’s in the download URL.
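For example, a package whose download URLs use underscores in version numbers might override url_version like this (the package name and URL are illustrative):

from spack.package import *


class Mylib(Package):
    """Hypothetical package with underscore-separated versions in its URLs."""

    url = "https://example.com/mylib/mylib_1_2_3.tar.gz"  # illustrative URL

    def url_version(self, version):
        # 1.2.3 -> 1_2_3, so url_for_version() can substitute it into `url`.
        return str(version).replace(".", "_")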

variants: Dict[str, Tuple[Variant, Spec]]
property version
classmethod version_urls() Dict[StandardVersion, str][source]

Dict of explicitly defined URLs for versions of this package.

Returns:

A dict mapping version to URL, ordered by version.

A version’s URL only appears in the result if it has an explicitly defined url argument. So, this dict may be empty if a package only defines url at the top level.

versions: dict
view()[source]

Create a view with the prefix of this package as the root. Extensions added to this view will modify the installation prefix of this package.

virtual = False

By default, packages are not virtual. Virtual packages override this attribute.

property virtuals_provided

virtual packages provided by this package with its spec

exception spack.package_base.PackageError(message, long_msg=None)[source]

Bases: SpackError

Raised when something is wrong with a package definition.

class spack.package_base.PackageMeta(name, bases, attr_dict)[source]

Bases: PhaseCallbacksMeta, DetectablePackageMeta, DirectiveMeta, MultiMethodMeta

Package metaclass for supporting directives (e.g., depends_on) and phases

exception spack.package_base.PackageStillNeededError(spec, dependents)[source]

Bases: InstallError

Raised when package is still needed by another on uninstall.

class spack.package_base.PackageViewMixin[source]

Bases: object

This collects all functionality related to adding installed Spack packages to views. Packages can customize how they are added to views by overriding these functions.

add_files_to_view(view, merge_map, skip_if_exists=True)[source]

Given a map of package files to destination paths in the view, add the files to the view. By default this adds all files. Alternative implementations may skip some files, for example if other packages linked into the view already include the file.

Parameters:
  • view (spack.filesystem_view.FilesystemView) – the view that’s updated

  • merge_map (dict) – maps absolute source paths to absolute dest paths for all files from this package.

  • skip_if_exists (bool) – when True, don’t link files in view when they already exist. When False, always link files, without checking if they already exist.
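A hedged sketch of a package customizing how its files are merged into a view, for instance to skip a file that other packages also provide; the file name is hypothetical:

from spack.package import *


class MyPackage(Package):
    """Hypothetical package that excludes one shared file from views."""

    def add_files_to_view(self, view, merge_map, skip_if_exists=True):
        # Filter out a hypothetical file that other packages also install,
        # then defer to the default behavior for everything else.
        filtered = {
            src: dst
            for src, dst in merge_map.items()
            if not dst.endswith("etc/shared-config.conf")
        }
        super().add_files_to_view(view, filtered, skip_if_exists)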

remove_files_from_view(view, merge_map)[source]

Given a map of package files to files currently linked in the view, remove the files from the view. The default implementation removes all files. Alternative implementations may not remove all files. For example if two packages include the same file, it should only be removed when both packages are removed.

view_destination(view)[source]

The target root directory: each file is added relative to this directory.

view_file_conflicts(view, merge_map)[source]

Report any files which prevent adding this package to the view. The default implementation looks for any files which already exist. Alternative implementations may allow some of the files to exist in the view (in this case they would be omitted from the results).

view_source()[source]

The source root directory that will be added to the view: files are added such that their path relative to the view destination matches their path relative to the view source.

class spack.package_base.WindowsRPath[source]

Bases: object

Collection of functionality surrounding Windows RPATH specific features

This is essentially meaningless for all other platforms due to their use of RPATH. All methods within this class are no-ops on non-Windows platforms. Packages can customize and manipulate this class as they would a genuine RPATH, i.e., adding directories that contain runtime library dependencies.

win_add_library_dependent()[source]

Return an extra set of directories that require linking for the package.

This method should be overridden by packages that produce binaries/libraries/python extension modules/etc. that are installed into directories outside a package’s bin, lib, and lib64 directories, but still require linking against one of the package’s dependencies, or other components of the package itself. No-op otherwise.

Returns:

List of additional directories that require linking

win_add_rpath()[source]

Return an extra set of rpaths for the package.

This method should be overridden by packages needing to include additional paths to be searched by rpath. No-op otherwise.

Returns:

List of additional rpaths
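A minimal sketch of a Windows-only customization, assuming a hypothetical package that installs DLLs into non-standard plugins and tools directories:

from spack.package import *


class MyWinPackage(Package):
    """Hypothetical package with DLLs outside bin, lib, and lib64."""

    def win_add_rpath(self):
        # Extra directories to search for runtime libraries on Windows.
        return [self.prefix.plugins]

    def win_add_library_dependent(self):
        # Directories whose binaries link against this package's DLLs.
        return [self.prefix.tools]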

windows_establish_runtime_linkage()[source]

Establish RPATH on Windows

Performs symlinking to incorporate rpath dependencies into Windows runtime search paths

spack.package_base.build_system_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]

flag_handler that passes flags to the build system arguments. Any package using build_system_flags must also implement flags_to_build_system_args, or derive from a class that implements it. Currently, AutotoolsPackage and CMakePackage implement it.

spack.package_base.deprecated_version(pkg: PackageBase, version: str | StandardVersion) bool[source]

Return True iff the version is deprecated.

Parameters:
  • pkg – The package whose version is to be checked.

  • version – The version being checked

spack.package_base.detectable_packages = {}

Registry of detectable packages, keyed by repo and package name. A pass over the package repositories is needed to fill it.

spack.package_base.env_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]

flag_handler that adds all flags to canonical environment variables.

spack.package_base.flatten_dependencies(spec, flat_dir)[source]

Make each dependency of spec present in dir via symlink.

spack.package_base.inject_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]

flag_handler that injects all flags through the compiler wrapper.
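These callables are typically assigned to a package's flag_handler attribute, or wrapped by a custom handler. A hedged sketch, with the package and extra flag hypothetical:

from spack.package import *


class MyPackage(CMakePackage):
    """Hypothetical package choosing how compiler flags are delivered."""

    # Pass flags as build system arguments instead of through the wrappers
    # (CMakePackage implements the required flags_to_build_system_args).
    flag_handler = build_system_flags

    # Alternatively, a custom handler can tweak flags and delegate:
    # def flag_handler(self, name, flags):
    #     if name == "cflags":
    #         flags.append("-O2")  # illustrative extra flag
    #     return inject_flags(name, flags)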

spack.package_base.install_dependency_symlinks(pkg, spec, prefix)[source]

Execute a dummy install and flatten dependencies.

This routine can be used in a package.py definition by setting install = install_dependency_symlinks.

This feature comes in handy for creating a common location for the installation of third-party libraries.

spack.package_base.on_package_attributes(**attr_dict)[source]

Decorator: executes instance function only if object has the required attribute values.

Executes the decorated method only if at the moment of calling the instance has attributes that are equal to certain values.

Parameters:

attr_dict (dict) – dictionary mapping attribute names to their required values
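A hedged usage sketch: run a post-install check only when the run_tests attribute is set (as it is by spack install --test); the package and check target are hypothetical:

from spack.package import *


class MyPackage(Package):
    """Hypothetical package with an optional post-install check."""

    @run_after("install")
    @on_package_attributes(run_tests=True)
    def check_install(self):
        # Executed only when self.run_tests is True at call time.
        make("check")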

spack.package_base.possible_dependencies(*pkg_or_spec: str | Spec | Type[PackageBase], transitive: bool = True, expand_virtuals: bool = True, depflag: int = 15, missing: dict | None = None, virtuals: set | None = None) Dict[str, Set[str]][source]

Get the possible dependencies of a number of packages.

See PackageBase.possible_dependencies for details.

spack.package_base.preferred_version(pkg: PackageBase)[source]

Returns a sorted list of the preferred versions of the package.

Parameters:

pkg – The package whose versions are to be assessed.

spack.package_base.spack_times_log = 'install_times.json'

Filename of json with total build and phase times (seconds)

spack.package_base.use_cray_compiler_names()[source]

Compiler names for builds that rely on cray compiler names.

spack.package_prefs module

class spack.package_prefs.PackagePrefs(pkgname, component, vpkg=None, all=True)[source]

Bases: object

Defines the sort order for a set of specs.

Spack’s package preference implementation uses PackagePrefs objects to define sort order. The PackagePrefs class looks at Spack’s packages.yaml configuration and, when called on a spec, returns a key that can be used to sort that spec in order of the user’s preferences.

You can use it like this:

# key function sorts CompilerSpecs for mpich in order of preference
kf = PackagePrefs('mpich', 'compiler')
compiler_list.sort(key=kf)

Or like this:

# key function to sort VersionLists for OpenMPI in order of preference.
kf = PackagePrefs('openmpi', 'version')
version_list.sort(key=kf)

Optionally, you can sort in order of preferred virtual dependency providers. To do that, provide 'providers' and a third argument denoting the virtual package (e.g., mpi):

kf = PackagePrefs('trilinos', 'providers', 'mpi')
provider_spec_list.sort(key=kf)

classmethod has_preferred_providers(pkgname, vpkg)[source]

Whether a specific package has preferred providers for a virtual package.

classmethod has_preferred_targets(pkg_name)[source]

Whether a specific package has preferred targets.

classmethod order_for_package(pkgname, component, vpkg=None, all=True)[source]

Given a package name, sort component (e.g., version, compiler, …), and an optional vpkg, return the list from the packages config.

classmethod preferred_variants(pkg_name)[source]

Return a VariantMap of preferred variants/values for a spec.

exception spack.package_prefs.VirtualInPackagesYAMLError(message, long_message=None)[source]

Bases: SpackError

Raised when a disallowed virtual is found in packages.yaml

spack.package_prefs.get_package_dir_permissions(spec)[source]

Return the permissions configured for the spec.

Include the GID bit if group permissions are on. This makes the group attribute sticky for the directory. Package-specific settings take precedence over the settings for all packages.

spack.package_prefs.get_package_group(spec)[source]

Return the unix group associated with the spec.

Package-specific settings take precedence over the settings for all packages.

spack.package_prefs.get_package_permissions(spec)[source]

Return the permissions configured for the spec.

Package-specific settings take precedence over the settings for all packages.

spack.package_prefs.is_spec_buildable(spec)[source]

Return True if the spec is configured as buildable.

spack.package_prefs.spec_externals(spec)[source]

Return a list of external specs (w/external directory path filled in), one for each known external installation.

spack.package_test module

spack.package_test.compare_output(current_output, blessed_output)[source]

Compare blessed and current output of executables.

spack.package_test.compare_output_file(current_output, blessed_output_file)[source]

Same as above, but when the blessed output is given as a file.

spack.package_test.compile_c_and_execute(source_file, include_flags, link_flags)[source]

Compile the C source_file with the given include_flags and link_flags, run the resulting executable, and return its output.

spack.parser module

Parser for spec literals

Here is the EBNF grammar for a spec:

spec          = [name] [node_options] { ^[edge_properties] node } |
                [name] [node_options] hash |
                filename

node          =  name [node_options] |
                 [name] [node_options] hash |
                 filename

node_options    = [@(version_list|version_pair)] [%compiler] { variant }
edge_properties = [ { bool_variant | key_value } ]

hash          = / id
filename      = (.|/|[a-zA-Z0-9-_]*/)([a-zA-Z0-9-_./]*)(.json|.yaml)

name          = id | namespace id
namespace     = { id . }

variant       = bool_variant | key_value | propagated_bv | propagated_kv
bool_variant  =  +id |  ~id |  -id
propagated_bv = ++id | ~~id | --id
key_value     =  id=id |  id=quoted_id
propagated_kv = id==id | id==quoted_id

compiler      = id [@version_list]

version_pair  = git_version=vid
version_list  = (version|version_range) [ { , (version|version_range)} ]
version_range = vid:vid | vid: | :vid | :
version       = vid

git_version   = git.(vid) | git_hash
git_hash      = [A-Fa-f0-9]{40}

quoted_id     = " id_with_ws " | ' id_with_ws '
id_with_ws    = [a-zA-Z0-9_][a-zA-Z_0-9-.\s]*
vid           = [a-zA-Z0-9_][a-zA-Z_0-9-.]*
id            = [a-zA-Z0-9_][a-zA-Z_0-9-]*

Identifiers using the <name>=<value> syntax, such as architectures and compiler flags, require a space before the name.

There is one context-sensitive part: ids in versions may contain ‘.’, while other ids may not.

There is one ambiguity: since ‘-’ is allowed in an id, you need to put whitespace before -variant for it to be tokenized properly. You can either use whitespace, or you can just use ~variant since it means the same thing. Spack uses ~variant in directory names and in the canonical form of specs to avoid ambiguity. Both are provided because ~ can cause shell expansion when it is the first character in an id typed on the command line.
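A few example spec literals accepted by this grammar; the package names and versions are illustrative:

hdf5@1.12.2 +mpi %gcc@11.2.0 ^openmpi@4.1.4   # version, variant, compiler, dependency
libfoo ~shared cflags=='-O3 -g'               # propagated key-value pair with a quoted value
zlib/abcdef1                                  # node referenced by a DAG hash prefix
./my-spec.yaml                                # spec read from a JSON/YAML file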

spack.parser.ALL_TOKENS = re.compile('(?P<START_EDGE_PROPERTIES>(?:\\^\\[))|(?P<END_EDGE_PROPERTIES>(?:\\]))|(?P<DEPENDENCY>(?:\\^))|(?P<VERSION_HASH_PAIR>(?:@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40})))

Regex to scan a valid text

spack.parser.ANALYSIS_REGEX = re.compile('(?P<START_EDGE_PROPERTIES>(?:\\^\\[))|(?P<END_EDGE_PROPERTIES>(?:\\]))|(?P<DEPENDENCY>(?:\\^))|(?P<VERSION_HASH_PAIR>(?:@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40})))

Regex to analyze an invalid text

spack.parser.ERROR_HANDLING_REGEXES = ['(?P<START_EDGE_PROPERTIES>(?:\\^\\[))', '(?P<END_EDGE_PROPERTIES>(?:\\]))', '(?P<DEPENDENCY>(?:\\^))', '(?P<VERSION_HASH_PAIR>(?:@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40}))))=(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))))', '(?P<GIT_VERSION>@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40})))))', '(?P<VERSION>(?:@\\s*(?:(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(?:\\s*,\\s*(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<PROPAGATED_BOOL_VARIANT>(?:(?:\\+\\+|~~|--)\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<BOOL_VARIANT>(?:[~+-]\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<PROPAGATED_KEY_VALUE_PAIR>(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*==(?:(?:[a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|(?:\'(?:[^\']|(?<=\\\\)\')*\'|\\"(?:[^\\"]|(?<=\\\\)\\")*\\"))))', '(?P<KEY_VALUE_PAIR>(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*=(?:(?:[a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|(?:\'(?:[^\']|(?<=\\\\)\')*\'|\\"(?:[^\\"]|(?<=\\\\)\\")*\\"))))', '(?P<COMPILER_AND_VERSION>(?:%\\s*(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)(?:[\\s]*)@\\s*(?:(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(?:\\s*,\\s*(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<COMPILER>(?:%\\s*(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)))', '(?P<FILENAME>(?:(?:\\.|\\/|[a-zA-Z0-9-_]*\\/)(?:[a-zA-Z0-9-_\\.\\/]*)(?:\\.json|\\.yaml)))', '(?P<FULLY_QUALIFIED_PACKAGE_NAME>(?:(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(?:\\.(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)))', '(?P<UNQUALIFIED_PACKAGE_NAME>(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))', '(?P<DAG_HASH>(?:/(?:[a-zA-Z_0-9]+)))', '(?P<WS>(?:\\s+))', '(?P<UNEXPECTED>(?:.[\\s]*))']

List of all valid regexes followed by error analysis regexes

class spack.parser.EdgeAttributeParser(ctx, literal_str)[source]

Bases: object

ctx
literal_str
parse()[source]
class spack.parser.ErrorTokenType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: TokenBase

Enum with regexes for error analysis

UNEXPECTED = 1
class spack.parser.FileParser(ctx)[source]

Bases: object

Parse a single spec from a JSON or YAML file

ctx
parse(initial_spec: Spec) Spec[source]

Parse a spec tree from a specfile.

Parameters:

initial_spec – object where to parse the spec

Returns:

The initial_spec passed as argument, once constructed

spack.parser.GIT_REF = '(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)'

Git refs include branch names, and can contain “.” and “/”

spack.parser.IDENTIFIER = '(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*)'

Valid name for specs and variants. Here we are not using the previous “w[w.-]*” since that would match most characters that can be part of a word in any language

spack.parser.NO_QUOTES_NEEDED = re.compile('^[a-zA-Z0-9,/_.-]+$')

Variant/flag values that match this can be left unquoted in Spack output

spack.parser.QUOTED_VALUE = '(?:\'(?:[^\']|(?<=\\\\)\')*\'|\\"(?:[^\\"]|(?<=\\\\)\\")*\\")'

Quoted values can be anything in between quotes, including escaped quotes.

spack.parser.SPLIT_KVP = re.compile('^([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)(==?)(.*)$')

Regex with groups to use for splitting (optionally propagated) key-value pairs

spack.parser.STRIP_QUOTES = re.compile('^([\'\\"])(.*)\\1$')

Regex to strip quotes. Group 2 will be the unquoted string.

class spack.parser.SpecNodeParser(ctx)[source]

Bases: object

Parse a single spec node from a stream of tokens

ctx
has_compiler
has_version
parse(initial_spec: Spec | None = None) Spec | None[source]

Parse a single spec node from a stream of tokens

Parameters:

initial_spec – object to be constructed

Returns:

The object passed as argument

class spack.parser.SpecParser(literal_str: str)[source]

Bases: object

Parse text into specs

all_specs() List[Spec][source]

Return all the specs that remain to be parsed

ctx
literal_str
next_spec(initial_spec: Spec | None = None) Spec | None[source]

Return the next spec parsed from text.

Parameters:

initial_spec – object where to parse the spec. If None a new one will be created.

Returns:

The spec that was parsed

tokens() List[Token][source]

Return the entire list of tokens from the initial text. Whitespace is filtered out.
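A hedged sketch of driving the parser programmatically; the spec strings are illustrative:

import spack.parser

# Parse a whole string into a list of specs...
specs = spack.parser.parse("zlib@1.2.13 %gcc ^cmake")

# ...or consume specs one at a time.
parser = spack.parser.SpecParser("hdf5 +mpi  openmpi@4.1")
first = parser.next_spec()      # hdf5+mpi
remaining = parser.all_specs()  # specs not yet parsed, here [openmpi@4.1]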

exception spack.parser.SpecParsingError(message, token, text)[source]

Bases: SpecSyntaxError

Error when parsing tokens

exception spack.parser.SpecSyntaxError[source]

Bases: Exception

Base class for Spec syntax errors

exception spack.parser.SpecTokenizationError(matches, text)[source]

Bases: SpecSyntaxError

Syntax error in a spec string

spack.parser.TOKEN_REGEXES = ['(?P<START_EDGE_PROPERTIES>(?:\\^\\[))', '(?P<END_EDGE_PROPERTIES>(?:\\]))', '(?P<DEPENDENCY>(?:\\^))', '(?P<VERSION_HASH_PAIR>(?:@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40}))))=(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))))', '(?P<GIT_VERSION>@(?:(?:(?:git\\.(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9./\\-]*)))|(?:(?:[A-Fa-f0-9]{40})))))', '(?P<VERSION>(?:@\\s*(?:(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(?:\\s*,\\s*(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<PROPAGATED_BOOL_VARIANT>(?:(?:\\+\\+|~~|--)\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<BOOL_VARIANT>(?:[~+-]\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<PROPAGATED_KEY_VALUE_PAIR>(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*==(?:(?:[a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|(?:\'(?:[^\']|(?<=\\\\)\')*\'|\\"(?:[^\\"]|(?<=\\\\)\\")*\\"))))', '(?P<KEY_VALUE_PAIR>(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*=(?:(?:[a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|(?:\'(?:[^\']|(?<=\\\\)\')*\'|\\"(?:[^\\"]|(?<=\\\\)\\")*\\"))))', '(?P<COMPILER_AND_VERSION>(?:%\\s*(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)(?:[\\s]*)@\\s*(?:(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(?:\\s*,\\s*(?:(?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))?:(?:=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=))?)|=?(?:[a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<COMPILER>(?:%\\s*(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)))', '(?P<FILENAME>(?:(?:\\.|\\/|[a-zA-Z0-9-_]*\\/)(?:[a-zA-Z0-9-_\\.\\/]*)(?:\\.json|\\.yaml)))', '(?P<FULLY_QUALIFIED_PACKAGE_NAME>(?:(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(?:\\.(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)))', '(?P<UNQUALIFIED_PACKAGE_NAME>(?:(?:[a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))', '(?P<DAG_HASH>(?:/(?:[a-zA-Z_0-9]+)))', '(?P<WS>(?:\\s+))']

List of all the regexes used to match spec parts, in order of precedence

class spack.parser.Token(kind: TokenBase, value: str, start: int | None = None, end: int | None = None)[source]

Bases: object

Represents tokens; generated from input by lexer and fed to parse().

end
kind
start
value
class spack.parser.TokenBase(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: Enum

Base class for an enum type with a regex value

class spack.parser.TokenContext(token_stream: Iterator[Token])[source]

Bases: object

Token context passed around by parsers

accept(kind: TokenType)[source]

If the next token is of the specified kind, advance the stream and return True. Otherwise return False.

advance()[source]

Advance one token

current_token
expect(*kinds: TokenType)[source]
next_token
token_stream
class spack.parser.TokenType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Bases: TokenBase

Enumeration of the different token kinds in the spec grammar.

Order of declaration is extremely important, since text containing specs is parsed with a single regex obtained by "|".join(...) of all the regex in the order of declaration.

BOOL_VARIANT = 8
COMPILER = 12
COMPILER_AND_VERSION = 11
DAG_HASH = 16
DEPENDENCY = 3
END_EDGE_PROPERTIES = 2
FILENAME = 13
FULLY_QUALIFIED_PACKAGE_NAME = 14
GIT_VERSION = 5
KEY_VALUE_PAIR = 10
PROPAGATED_BOOL_VARIANT = 7
PROPAGATED_KEY_VALUE_PAIR = 9
START_EDGE_PROPERTIES = 1
UNQUALIFIED_PACKAGE_NAME = 15
VERSION = 6
VERSION_HASH_PAIR = 4
WS = 17
spack.parser.VALUE = '(?:[a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)'

These are legal values that can be parsed bare, without quotes on the command line.

spack.parser.parse(text: str) List[Spec][source]

Parse text into a list of specs.

Parameters:

text (str) – text to be parsed

Returns:

List of specs

spack.parser.parse_one_or_raise(text: str, initial_spec: Spec | None = None) Spec[source]

Parse exactly one spec from text and return it, or raise

Parameters:
  • text (str) – text to be parsed

  • initial_spec – buffer where to parse the spec. If None a new one will be created.

spack.parser.quote_if_needed(value: str) str[source]

Add quotes around the value if it requires quotes.

This will add quotes around the value unless it matches NO_QUOTES_NEEDED.

This adds:
  • single quotes by default

  • double quotes around any value that contains single quotes

If double quotes are used, we JSON-escape the string. That is, we escape \, ", and control codes.

spack.parser.strip_quotes_and_unescape(string: str) str[source]

Remove surrounding single or double quotes from string, if present.

spack.parser.tokenize(text: str) Iterator[Token][source]

Return a token generator from the text passed as input.

Raises:

SpecTokenizationError – if we can’t tokenize anymore, but didn’t reach the end of the input text.

spack.patch module

class spack.patch.FilePatch(pkg: PackageBase, relative_path: str, level: int, working_dir: str, reverse: bool = False, ordering_key: Tuple[str, int] | None = None)[source]

Bases: Patch

Describes a patch that is retrieved from a file in the repository.

property sha256: str

Get the patch checksum.

Returns:

The sha256 of the patch file.

to_dict() Dict[str, Any][source]

Dictionary representation of the patch.

Returns:

A dictionary representation.

exception spack.patch.NoSuchPatchError(message, long_message=None)[source]

Bases: SpackError

Raised when a patch file doesn’t exist.

class spack.patch.Patch(pkg: PackageBase, path_or_url: str, level: int, working_dir: str, reverse: bool = False)[source]

Bases: object

Base class for patches.

The owning package is not necessarily the package to apply the patch to – in the case where a dependent package patches its dependency, it is the dependent’s fullname.

apply(stage: Stage) None[source]

Apply a patch to source in a stage.

Parameters:

stage – stage where source code lives

sha256: str
to_dict() Dict[str, Any][source]

Dictionary representation of the patch.

Returns:

A dictionary representation.

class spack.patch.PatchCache(repository: RepoPath, data: Dict[str, Any] | None = None)[source]

Bases: object

Index of patches used in a repository, by sha256 hash.

This allows us to look up patches without loading all packages. It’s also needed to properly implement dependency patching, as we need a way to look up patches that come from packages not in the Spec sub-DAG.

The patch index is structured like this in a file (this is YAML, but we write JSON):

patches:
    sha256:
        namespace1.package1:
            <patch json>
        namespace2.package2:
            <patch json>
        ... etc. ...
classmethod from_json(stream: Any, repository: RepoPath) PatchCache[source]

Initialize a new PatchCache instance from JSON.

Parameters:
  • stream – stream of data

  • repository – repository containing package

Returns:

A new PatchCache instance.

patch_for_package(sha256: str, pkg: PackageBase) Patch[source]

Look up a patch in the index and build a patch object for it.

We build patch objects lazily because building them requires that we have information about the package’s location in its repo.

Parameters:
  • sha256 – sha256 hash to look up

  • pkg – Package object to get patch for.

Returns:

The patch object.

to_json(stream: Any) None[source]

Dump a JSON representation to a stream.

Parameters:

stream – stream of data

update(other: PatchCache) None[source]

Update this cache with the contents of another.

Parameters:

other – another patch cache to merge

update_package(pkg_fullname: str) None[source]

Update the patch cache.

Parameters:

pkg_fullname – package to update.

exception spack.patch.PatchDirectiveError(message, long_message=None)[source]

Bases: SpackError

Raised when the wrong arguments are supplied to the patch directive.

exception spack.patch.PatchLookupError(message, long_message=None)[source]

Bases: NoSuchPatchError

Raised when a patch file cannot be located from sha256.

class spack.patch.UrlPatch(pkg: PackageBase, url: str, level: int = 1, *, working_dir: str = '.', reverse: bool = False, sha256: str, ordering_key: Tuple[str, int] | None = None, archive_sha256: str | None = None)[source]

Bases: Patch

Describes a patch that is retrieved from a URL.

apply(stage: Stage) None[source]

Apply a patch to source in a stage.

Parameters:

stage – stage where source code lives

property stage: Stage

The stage in which to download (and unpack) the URL patch.

Returns:

The stage object.

to_dict() Dict[str, Any][source]

Dictionary representation of the patch.

Returns:

A dictionary representation.

spack.patch.apply_patch(stage: Stage, patch_path: str, level: int = 1, working_dir: str = '.', reverse: bool = False) None[source]

Apply the patch at patch_path to code in the stage.

Parameters:
  • stage – stage with code that will be patched

  • patch_path – filesystem location for the patch to apply

  • level – patch level

  • working_dir – relative path within the stage to change to

  • reverse – reverse the patch

spack.patch.from_dict(dictionary: Dict[str, Any], repository: RepoPath | None = None) Patch[source]

Create a patch from json dictionary.

Parameters:
  • dictionary – dictionary representation of a patch

  • repository – repository containing package

Returns:

A patch object.

Raises:

ValueError – If owner or url/relative_path are missing in the dictionary.

spack.paths module

Defines paths that are part of Spack’s directory structure.

Do not import other spack modules here. This module is used throughout Spack and should bring in a minimal number of external dependencies.

spack.paths.bin_path = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/bin'

bin directory in the spack prefix

spack.paths.default_misc_cache_path = '/home/docs/.spack/cache'

transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)

spack.paths.default_monitor_path = '/home/docs/.spack/reports/monitor'

spack monitor analysis directories

spack.paths.default_test_path = '/home/docs/.spack/test'

installation test (spack test) output

spack.paths.default_user_bootstrap_path = '/home/docs/.spack/bootstrap'

bootstrap store for bootstrapping clingo and other tools

spack.paths.prefix = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root'

This file lives in $prefix/lib/spack/spack/__file__

spack.paths.reports_path = '/home/docs/.spack/reports'

junit, cdash, etc. reports about builds

spack.paths.sbang_script = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/bin/sbang'

The sbang script in the spack installation

spack.paths.spack_root = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root'

synonym for prefix

spack.paths.spack_script = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/bin/spack'

The spack script itself

spack.paths.system_config_path = '/etc/spack'

System configuration location

spack.paths.user_config_path = '/home/docs/.spack'

User configuration location

spack.paths.user_repos_cache_path = '/home/docs/.spack/git_repos'

git repositories fetched to compare commits to versions

spack.projections module

spack.projections.get_projection(projections, spec)[source]

Get the projection for a spec from a projections dict.
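A hedged sketch of querying a projections dictionary, as it might appear in a view configuration; the keys, spec, and path templates are illustrative:

import spack.projections
import spack.spec

# Hypothetical projections: one for MPI-dependent specs, one fallback.
projections = {
    "^mpi": "{name}-{version}/mpi",
    "all": "{name}-{version}",
}

spec = spack.spec.Spec("hdf5@1.12.2 ^openmpi")
projection = spack.projections.get_projection(projections, spec)
# The first key the spec satisfies wins; "all" acts as the fallback.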

spack.provider_index module

Classes and functions to manage providers of virtual dependencies

class spack.provider_index.ProviderIndex(repository: Repo | RepoPath, specs: List[Spec] | None = None, restrict: bool = False)[source]

Bases: _IndexBase

copy()[source]

Return a deep copy of this index.

static from_json(stream, repository)[source]

Construct a provider index from its JSON representation.

Parameters:

stream – stream where to read from the JSON data

merge(other)[source]

Merge another provider index into this one.

Parameters:

other (ProviderIndex) – provider index to be merged

remove_provider(pkg_name)[source]

Remove a provider from the ProviderIndex.

to_json(stream=None)[source]

Dump a JSON representation of this object.

Parameters:

stream – stream where to dump

update(spec)[source]

Update the provider index with additional virtual specs.

Parameters:

spec – spec potentially providing additional virtual specs

exception spack.provider_index.ProviderIndexError(message, long_message=None)[source]

Bases: SpackError

Raised when there is a problem with a ProviderIndex.

spack.relocate module

exception spack.relocate.InstallRootStringError(file_path, root_path)[source]

Bases: SpackError

spack.relocate.fixup_macos_rpath(root, filename)[source]

Apply rpath fixups to the given file.

Parameters:
  • root – absolute path to the parent directory

  • filename – relative path to the library or binary

Returns:

True if fixups were applied, else False

spack.relocate.fixup_macos_rpaths(spec)[source]

Remove duplicate and nonexistent rpaths.

Some autotools packages write their own -rpath entries in addition to those implicitly added by the Spack compiler wrappers. On Linux these duplicate rpaths are eliminated, but on macOS they result in multiple entries which makes it harder to adjust with install_name_tool -delete_rpath.

spack.relocate.is_binary(filename)[source]

Returns True if a file is binary, False otherwise

Parameters:

filename – file to be tested

Returns:

True or False

spack.relocate.macho_find_paths(orig_rpaths, deps, idpath, old_layout_root, prefix_to_prefix)[source]

Inputs:
  • original rpaths from Mach-O binaries

  • dependency libraries for Mach-O binaries

  • id path of Mach-O libraries

  • old install directory layout root

  • prefix_to_prefix dictionary which maps prefixes in the old directory layout to directories in the new directory layout

Output:

paths_to_paths dictionary which maps all of the old paths to new paths

spack.relocate.macho_make_paths_normal(orig_path_name, rpaths, deps, idpath)[source]

Return a dictionary mapping the relativized rpaths to the original rpaths. This dictionary is used to replace paths in mach-o binaries. Replace @loader_path with the dirname of the original path name in rpaths and deps; idpath is replaced with the original path name.

spack.relocate.macho_make_paths_relative(path_name, old_layout_root, rpaths, deps, idpath)[source]

Return a dictionary mapping the original rpaths to the relativized rpaths. This dictionary is used to replace paths in mach-o binaries. Replace old_dir with relative path from dirname of path name in rpaths and deps; idpath is replaced with @rpath/libname.

spack.relocate.macholib_get_paths(cur_path)[source]

Get rpaths, dependent libraries, and library id of mach-o objects.

spack.relocate.make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root)[source]

Replace the original RPATHs in the new binaries making them relative to the original layout root.

Parameters:
  • new_binaries (list) – new binaries whose RPATHs are to be made relative

  • orig_binaries (list) – original binaries

  • orig_layout_root (str) – path to be used as a base for making RPATHs relative

spack.relocate.make_link_relative(new_links, orig_links)[source]

Compute the relative target from the original link and make the new link relative.

Parameters:
  • new_links (list) – new links to be made relative

  • orig_links (list) – original links

spack.relocate.make_macho_binaries_relative(cur_path_names, orig_path_names, old_layout_root)[source]

Replace old RPATHs with paths relative to old_dir in binary files

spack.relocate.modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths)[source]

This function is used to make Mach-O buildcaches on macOS by replacing old paths with new paths using install_name_tool.

Inputs:
  • Mach-O binary to be modified

  • original rpaths

  • original dependency paths

  • original id path, if a Mach-O library

  • dictionary mapping paths in the old install layout to the new install layout

spack.relocate.modify_object_macholib(cur_path, paths_to_paths)[source]

This function is used when installing Mach-O buildcaches on Linux, by rewriting Mach-O loader commands for dependency library paths of Mach-O binaries and the id path for Mach-O libraries. Rewriting of rpaths is handled by replace_prefix_bin.

Inputs:
  • Mach-O binary to be modified

  • dictionary mapping paths in the old install layout to the new install layout

spack.relocate.needs_binary_relocation(m_type, m_subtype)[source]

Returns True if the file with MIME type/subtype passed as arguments needs binary relocation, False otherwise.

Parameters:
  • m_type (str) – MIME type of the file

  • m_subtype (str) – MIME subtype of the file

spack.relocate.needs_text_relocation(m_type, m_subtype)[source]

Returns True if the file with MIME type/subtype passed as arguments needs text relocation, False otherwise.

Parameters:
  • m_type (str) – MIME type of the file

  • m_subtype (str) – MIME subtype of the file

spack.relocate.new_relocate_elf_binaries(binaries, prefix_to_prefix)[source]

Take a list of binaries, and an ordered dictionary of prefix to prefix mapping, and update the rpaths accordingly.

spack.relocate.relocate_elf_binaries(binaries, orig_root, new_root, new_prefixes, rel, orig_prefix, new_prefix)[source]

Relocate the binaries passed as arguments by changing their RPATHs.

Use patchelf to get the original RPATHs and then replace them with rpaths in the new directory layout.

New RPATHs are determined from a dictionary mapping the prefixes in the old directory layout to the prefixes in the new directory layout if the rpath was in the old layout root, i.e. system paths are not replaced.

Parameters:
  • binaries (list) – list of binaries that might need relocation, located in the new prefix

  • orig_root (str) – original root to be substituted

  • new_root (str) – new root to be used, only relevant for relative RPATHs

  • new_prefixes (dict) – dictionary that maps the original prefixes to where they should be relocated

  • rel (bool) – True if the RPATHs are relative, False if they are absolute

  • orig_prefix (str) – prefix where the executable was originally located

  • new_prefix (str) – prefix where we want to relocate the executable

Relocate links to a new install prefix.

spack.relocate.relocate_macho_binaries(path_names, old_layout_root, new_layout_root, prefix_to_prefix, rel, old_prefix, new_prefix)[source]

Use the macholib python package to get the rpaths, dependent libraries and library identity for libraries from the MachO object. Modify them with the replacement paths queried from the dictionary mapping old layout prefixes to hashes and the dictionary mapping hashes to the new layout prefixes.

spack.relocate.relocate_text(files, prefixes)[source]

Relocate text file from the original installation prefix to the new prefix.

Relocation also affects the path in Spack’s sbang script.

Parameters:
  • files (list) – Text files to be relocated

  • prefixes (OrderedDict) – String prefixes which need to be changed

spack.relocate.relocate_text_bin(binaries, prefixes)[source]

Replace null terminated path strings hard-coded into binaries.

The new install prefix must be shorter than the original one.

Parameters:
  • binaries (list) – binaries to be relocated

  • prefixes (OrderedDict) – String prefixes which need to be changed.

Raises:

spack.relocate_text.BinaryTextReplaceError – when the new path is longer than the old path

spack.relocate_text module

This module contains pure-Python classes and functions for replacing paths inside text files and binaries.

class spack.relocate_text.BinaryFilePrefixReplacer(prefix_to_prefix, suffix_safety_size=7)[source]

Bases: PrefixReplacer

classmethod binary_text_regex(binary_prefixes, suffix_safety_size=7)[source]

Create a regex that looks for exact matches of prefixes, and also tries to match a C-string type null terminator in a small lookahead window.

Parameters:
  • binary_prefixes (list) – List of byte strings of prefixes to match

  • suffix_safety_size (int) – Size of the lookahead for the null-terminated string.

Returns: compiled regex

classmethod from_strings_or_bytes(prefix_to_prefix: Dict[str | bytes, str | bytes], suffix_safety_size: int = 7) BinaryFilePrefixReplacer[source]

Create a BinaryFilePrefixReplacer from an ordered prefix to prefix map.

Parameters:
  • prefix_to_prefix (OrderedDict) – Ordered mapping of prefix to prefix.

  • suffix_safety_size (int) – Number of bytes to retain at the end of a C-string to avoid binary string-aliasing issues.

exception spack.relocate_text.BinaryStringReplacementError(file_path, old_len, new_len)[source]

Bases: SpackError

exception spack.relocate_text.BinaryTextReplaceError(msg)[source]

Bases: SpackError

exception spack.relocate_text.CannotGrowString(old, new)[source]

Bases: BinaryTextReplaceError

exception spack.relocate_text.CannotShrinkCString(old, new, full_old_string)[source]

Bases: BinaryTextReplaceError

class spack.relocate_text.PrefixReplacer(prefix_to_prefix: Dict[bytes, bytes])[source]

Bases: object

Base class for applying a prefix to prefix map to a list of binaries or text files. Child classes implement _apply_to_file to do the actual work, which is different when it comes to binaries and text files.

apply(filenames: list)[source]

Returns a list of files that were modified

apply_to_file(f)[source]
apply_to_filename(filename)[source]
property is_noop: bool

Returns true when the prefix to prefix map is mapping everything to the same location (identity) or there are no prefixes to replace.

class spack.relocate_text.TextFilePrefixReplacer(prefix_to_prefix: Dict[bytes, bytes])[source]

Bases: PrefixReplacer

This class applies prefix to prefix mappings for relocation on text files.

Note that UTF-8 encoding is assumed.

classmethod from_strings_or_bytes(prefix_to_prefix: Dict[str | bytes, str | bytes]) TextFilePrefixReplacer[source]

Create a TextFilePrefixReplacer from an ordered prefix to prefix map.

spack.relocate_text.encode_path(p: str | bytes) bytes[source]
spack.relocate_text.filter_identity_mappings(prefix_to_prefix)[source]

Drop mappings that are not changed.

spack.relocate_text.utf8_path_to_binary_regex(prefix: str)[source]

Create a binary regex that matches the input path in utf8

spack.relocate_text.utf8_paths_to_single_binary_regex(prefixes)[source]

Create a (binary) regex that matches any input path in utf8

spack.repo module

exception spack.repo.BadRepoError(message, long_message=None)[source]

Bases: RepoError

Raised when repo layout is invalid.

exception spack.repo.FailedConstructorError(name, exc_type, exc_obj, exc_tb)[source]

Bases: RepoError

Raised when a package’s class constructor fails.

class spack.repo.FastPackageChecker(packages_path)[source]

Bases: Mapping

Cache that maps package names to the stats obtained on the ‘package.py’ files associated with them.

For each repository a cache is maintained at class level, and shared among all instances referring to it. Update of the global cache is done lazily during instance initialization.

invalidate()[source]

Regenerate cache for this checker.

last_mtime()[source]
modified_since(since: float) List[str][source]
class spack.repo.GitExe[source]

Bases: object

class spack.repo.Indexer(repository)[source]

Bases: object

Adaptor for indexes that need to be generated when repos are updated.

create()[source]
needs_update(pkg)[source]

Whether an update is needed when the package file hasn’t changed.

Returns:

True if this package needs its index updated, False otherwise.

Return type:

(bool)

We already automatically update indexes when package files change, but other files (like patches) may change underneath the package file. This method can be used to check additional package-specific files whenever they’re loaded, to tell the RepoIndex to update the index just for that package.

abstract read(stream)[source]

Read this index from a provided file object.

abstract update(pkg_fullname)[source]

Update the index in memory with information about a package.

abstract write(stream)[source]

Write the index to a file object.

exception spack.repo.InvalidNamespaceError(message, long_message=None)[source]

Bases: RepoError

Raised when an invalid namespace is encountered.

class spack.repo.MockRepositoryBuilder(root_directory, namespace=None)[source]

Bases: object

Build a mock repository in a directory

add_package(name, dependencies=None)[source]

Create a mock package in the repository, using a Jinja2 template.

Parameters:
  • name (str) – name of the new package

  • dependencies (list) – list of (“dep_spec”, “dep_type”, “condition”) tuples. Both “dep_type” and “condition” can default to None in which case spack.dependency.default_deptype and spack.spec.Spec() are used.

recipe_filename(name)[source]
remove(name)[source]
spack.repo.NOT_PROVIDED = <object object>

Guaranteed unused default value for some functions.

exception spack.repo.NoRepoConfiguredError(message, long_message=None)[source]

Bases: RepoError

Raised when there are no repositories configured.

spack.repo.PATH: RepoPath | Singleton = <spack.repo.RepoPath object>

Singleton repo path instance

class spack.repo.PatchIndexer(repository)[source]

Bases: Indexer

Lifecycle methods for patch cache.

needs_update()[source]

Whether an update is needed when the package file hasn’t changed.

Returns:

True if this package needs its index updated, False otherwise.

Return type:

(bool)

We already automatically update indexes when package files change, but other files (like patches) may change underneath the package file. This method can be used to check additional package-specific files whenever they’re loaded, to tell the RepoIndex to update the index just for that package.

read(stream)[source]

Read this index from a provided file object.

update(pkg_fullname)[source]

Update the index in memory with information about a package.

write(stream)[source]

Write the index to a file object.

class spack.repo.ProviderIndexer(repository)[source]

Bases: Indexer

Lifecycle methods for virtual package providers.

read(stream)[source]

Read this index from a provided file object.

update(pkg_fullname)[source]

Update the index in memory with information about a package.

write(stream)[source]

Write the index to a file object.

spack.repo.ROOT_PYTHON_NAMESPACE = 'spack.pkg'

Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>

class spack.repo.Repo(root, cache=None)[source]

Bases: object

Class representing a package repository in the filesystem.

Each package repository must have a top-level configuration file called repo.yaml.

Currently, repo.yaml must define:

namespace:

A Python namespace where the repository’s packages should live.
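For example, a minimal repo.yaml might look like this (the namespace is illustrative):

repo:
  namespace: mycompany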

all_package_classes()[source]

Iterator over all package classes in the repository.

Use this with care, because loading packages is slow.

all_package_names(include_virtuals=False)[source]

Returns a sorted list of all package names in the Repo.

all_package_paths()[source]
dirname_for_package_name(pkg_name)[source]

Get the directory name for a particular package. This is the directory that contains its package.py file.

dump_provenance(spec, path)[source]

Dump provenance information for a spec to a particular path.

This dumps the package file and any associated patch files. Raises UnknownPackageError if not found.

exists(pkg_name)[source]

Whether a package with the supplied name exists.

extensions_for(extendee_spec)[source]
filename_for_package_name(pkg_name)[source]

Get the filename for the module we should load for a particular package. Packages for a Repo live in $root/<package_name>/package.py

This will return a proper package.py path even if the package doesn’t exist yet, so callers will need to ensure the package exists before importing.

get(spec)[source]

Returns the package associated with the supplied spec.

get_pkg_class(pkg_name)[source]

Get the class for the package out of its module.

First loads (or fetches from cache) a module for the package. Then extracts the package class from the module according to Spack’s naming convention.

property index

Construct the index for this repo lazily.

is_prefix(fullname)[source]

True if fullname is a prefix of this Repo’s namespace.

is_virtual(pkg_name)[source]

Return True if the package with this name is virtual, False otherwise.

This function uses the provider index. If calling from a code block that is used to construct the provider index, use the is_virtual_safe function.

Parameters:

pkg_name (str) – name of the package we want to check

is_virtual_safe(pkg_name)[source]

Return True if the package with this name is virtual, False otherwise.

This function doesn’t use the provider index.

Parameters:

pkg_name (str) – name of the package we want to check

last_mtime()[source]

Time a package file in this repo was last updated.

package_path(name)[source]

Get path to package.py file for this repo.

packages_with_tags(*tags: str) Set[str][source]
partition_package_name(pkg_name: str) Tuple[str, str][source]
property patch_index

Index of patches and packages they’re defined on.

property provider_index

A provider index with names specific to this repo.

providers_for(vpkg_spec)[source]
purge()[source]

Clear entire package instance cache.

real_name(import_name)[source]

Allow users to import Spack packages using Python identifiers.

A Python identifier might map to many different Spack package names due to hyphen/underscore ambiguity.

Easy example:

num3proxy -> 3proxy

Ambiguous:

foo_bar -> foo_bar, foo-bar

More ambiguous:

foo_bar_baz -> foo_bar_baz, foo-bar-baz, foo_bar-baz, foo-bar_baz

property tag_index

Index of tags and which packages they’re defined on.

exception spack.repo.RepoError(message, long_message=None)[source]

Bases: SpackError

Superclass for repository-related errors.

class spack.repo.RepoIndex(package_checker: FastPackageChecker, namespace: str, cache: FileCache)[source]

Bases: object

Container class that manages a set of Indexers for a Repo.

This class is responsible for checking packages in a repository for updates (using FastPackageChecker) and for regenerating indexes when they’re needed.

Indexers should be added to the RepoIndex using add_indexer(name, indexer), and they should support the interface defined by Indexer, so that the RepoIndex can read, generate, and update stored indices.

Generated indexes are accessed by name via __getitem__().

add_indexer(name: str, indexer: Indexer)[source]

Add an indexer to the repo index.

Parameters:
  • name – name of this indexer

  • indexer – object implementing the Indexer interface

class spack.repo.RepoLoader(fullname, repo, package_name)[source]

Bases: _PrependFileLoader

Loads a Python module associated with a package in a specific repository

class spack.repo.RepoPath(*repos, **kwargs)[source]

Bases: object

A RepoPath is a list of repos that function as one.

It functions exactly like a Repo, but it operates on the combined results of the Repos in its list instead of on a single package repository.

Parameters:

repos (list) – list of Repo objects or paths to put in this RepoPath

all_package_classes()[source]
all_package_names(include_virtuals=False)[source]
all_package_paths()[source]
dirname_for_package_name(pkg_name)[source]
dump_provenance(spec, path)[source]

Dump provenance information for a spec to a particular path.

This dumps the package file and any associated patch files. Raises UnknownPackageError if not found.

exists(pkg_name)[source]

Whether a package with the given name exists in the path’s repos.

Note that virtual packages do not “exist”.

extensions_for(extendee_spec)[source]
filename_for_package_name(pkg_name)[source]
first_repo()[source]

Get the first repo in precedence order.

get(spec)[source]

Returns the package associated with the supplied spec.

get_pkg_class(pkg_name)[source]

Find a class for the spec’s package and return the class object.

get_repo(namespace, default=<object object>)[source]

Get a repository by namespace.

Parameters:

namespace – Look up this namespace in the RepoPath, and return it if found.

Optional Arguments:

default:

If default is provided, return it when the namespace isn’t found. If not, raise an UnknownNamespaceError.

is_virtual(pkg_name)[source]

Return True if the package with this name is virtual, False otherwise.

This function uses the provider index. If calling from a code block that is used to construct the provider index, use the is_virtual_safe function.

Parameters:

pkg_name (str) – name of the package we want to check

is_virtual_safe(pkg_name)[source]

Return True if the package with this name is virtual, False otherwise.

This function doesn’t use the provider index.

Parameters:

pkg_name (str) – name of the package we want to check

last_mtime()[source]

Time a package file in this repo was last updated.

package_path(name)[source]

Get path to package.py file for this repo.

packages_with_tags(*tags: str, full: bool = False) Set[str][source]

Returns a set of packages matching any of the tags in input.

Parameters:

full – if True the package names in the output are fully-qualified

property patch_index

Merged PatchIndex from all Repos in the RepoPath.

property provider_index

Merged ProviderIndex from all Repos in the RepoPath.

providers_for(vpkg_spec)[source]
put_first(repo)[source]

Add repo first in the search path.

put_last(repo)[source]

Add repo last in the search path.

remove(repo)[source]

Remove a repo from the search path.

repo_for_pkg(spec)[source]

Given a spec, get the repository for its package.

property tag_index

Merged TagIndex from all Repos in the RepoPath.

class spack.repo.ReposFinder[source]

Bases: object

MetaPathFinder class that loads a Python module corresponding to a Spack package

Return a loader based on the inspection of the current global repository list.

compute_loader(fullname)[source]
find_spec(fullname, python_path, target=None)[source]
class spack.repo.SpackNamespace(namespace)[source]

Bases: module

Allow lazy loading of modules.

class spack.repo.SpackNamespaceLoader[source]

Bases: object

create_module(spec)[source]
exec_module(module)[source]
class spack.repo.TagIndexer(repository)[source]

Bases: Indexer

Lifecycle methods for a TagIndex on a Repo.

read(stream)[source]

Read this index from a provided file object.

update(pkg_fullname)[source]

Update the index in memory with information about a package.

write(stream)[source]

Write the index to a file object.

exception spack.repo.UnknownEntityError(message, long_message=None)[source]

Bases: RepoError

Raised when we encounter a package spack doesn’t have.

exception spack.repo.UnknownNamespaceError(namespace, name=None)[source]

Bases: UnknownEntityError

Raised when we encounter an unknown namespace

exception spack.repo.UnknownPackageError(name, repo=None)[source]

Bases: UnknownEntityError

Raised when we encounter a package spack doesn’t have.

spack.repo.add_package_to_git_stage(packages)[source]

Add a package to the git stage with git add.

spack.repo.all_package_names(include_virtuals=False)[source]

Convenience wrapper around spack.repo.all_package_names().

spack.repo.autospec(function)[source]

Decorator that automatically converts the first argument of a function to a Spec.

spack.repo.create(configuration)[source]

Create a RepoPath from a configuration object.

Parameters:

configuration (spack.config.Configuration) – configuration object

spack.repo.create_or_construct(path, namespace=None)[source]

Create a repository, or just return a Repo if it already exists.

spack.repo.create_repo(root, namespace=None, subdir='packages')[source]

Create a new repository in root with the specified namespace.

If the namespace is not provided, use basename of root. Return the canonicalized path and namespace of the created repository.

spack.repo.diff_packages(rev1, rev2)[source]

Compute packages lists for the two revisions and return a tuple containing all the packages in rev1 but not in rev2 and all the packages in rev2 but not in rev1.

spack.repo.get_all_package_diffs(type, rev1='HEAD^1', rev2='HEAD')[source]

Show packages changed, added, or removed (or any combination of those) since a commit.

Parameters:
  • type (str) – String containing one or more of ‘A’, ‘B’, ‘C’

  • rev1 (str) – Revision to compare against, default is ‘HEAD^’

  • rev2 (str) – Revision to compare to rev1, default is ‘HEAD’

Returns:

A set containing the names of affected packages.

spack.repo.is_package_file(filename)[source]

Determine whether we are in a package file from a repo.

spack.repo.list_packages(rev)[source]

List all packages associated with the given revision

spack.repo.namespace_from_fullname(fullname)[source]

Return the repository namespace only for the full module name.

For instance:

namespace_from_fullname('spack.pkg.builtin.hdf5') == 'builtin'

Parameters:

fullname (str) – full name for the Python module

spack.repo.packages_path()[source]

Get the test repo if it is active, otherwise the builtin repo.

spack.repo.partition_package_name(pkg_name: str) Tuple[str, str][source]

Given a package name that might be fully-qualified, returns the namespace part (if present) and the unqualified package name.

If the package name is unqualified, the namespace is an empty string.

Parameters:

pkg_name – a package name, either unqualified like “llvm”, or fully-qualified, like “builtin.llvm”

spack.repo.python_package_for_repo(namespace)[source]

Returns the full namespace of a repository, given its relative one

For instance:

python_package_for_repo('builtin') == 'spack.pkg.builtin'

Parameters:

namespace (str) – repo namespace

spack.repo.use_repositories(*paths_and_repos, **kwargs)[source]

Use the repositories passed as arguments within the context manager.

Parameters:
  • *paths_and_repos – paths to the repositories to be used, or already constructed Repo objects

  • override (bool) – if True use only the repositories passed as input, if False add them to the top of the list of current repositories.

Returns:

Corresponding RepoPath object
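A hedged usage sketch; the repository path and package name are hypothetical:

import spack.repo

# Temporarily add an extra package repository on top of the configured ones.
with spack.repo.use_repositories("/path/to/my/repo", override=False) as repo_path:
    pkg_cls = repo_path.get_pkg_class("my-package")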

spack.report module

Tools to produce reports of spec installations

class spack.report.BuildInfoCollector(specs: List[Spec])[source]

Bases: InfoCollector

Collect information for the PackageInstaller._install_task method.

Parameters:

specs – specs whose install information will be recorded

extract_package_from_signature(instance, *args, **kwargs)[source]

Return the package instance, given the signature of the wrapped function.

fetch_log(pkg)[source]

Return the stdout log associated with the function being monitored

Parameters:

pkg – package under consideration

init_spec_record(input_spec, record)[source]

Add additional entries to a spec record when entering the collection context.

on_success(pkg, kwargs, package_record)[source]

Add additional properties on function call success.

class spack.report.InfoCollector(wrap_class: Type, do_fn: str, specs: List[Spec])[source]

Bases: object

Base class for context manager objects that collect information during the execution of certain package functions.

The data collected is available through the specs attribute once exited, and it’s organized as a list where each item represents the installation of one spec.

do_fn: str

Action to be reported on

extract_package_from_signature(instance, *args, **kwargs)[source]

Return the package instance, given the signature of the wrapped function.

fetch_log(pkg: PackageBase) str[source]

Return the stdout log associated with the function being monitored

Parameters:

pkg – package under consideration

init_spec_record(input_spec: Spec, record)[source]

Add additional entries to a spec record when entering the collection context.

input_specs: List[Spec]

Specs that will be acted on

on_success(pkg: PackageBase, kwargs, package_record)[source]

Add additional properties on function call success.

specs: List[Dict[str, Any]]

This is where we record the data that will be included in our report

wrap_class: Type

Class for which to wrap a function

class spack.report.TestInfoCollector(specs: List[Spec], record_directory: str)[source]

Bases: InfoCollector

Collect information for the PackageBase.do_test method.

Parameters:
  • specs – specs whose install information will be recorded

  • record_directory – record directory for test log paths

dir: str
extract_package_from_signature(instance, *args, **kwargs)[source]

Return the package instance, given the signature of the wrapped function.

fetch_log(pkg: PackageBase)[source]

Return the stdout log associated with the function being monitored

Parameters:

pkg – package under consideration

on_success(pkg, kwargs, package_record)[source]

Add additional properties on function call success.

spack.report.build_context_manager(reporter: Reporter, filename: str, specs: List[Spec])[source]

Decorate a package to generate a report after the installation function is executed.

Parameters:
  • reporter – object that generates the report

  • filename – filename for the report

  • specs – specs that need reporting

spack.report.test_context_manager(reporter: Reporter, filename: str, specs: List[Spec], raw_logs_dir: str)[source]

Decorate a package to generate a report after the test function is executed.

Parameters:
  • reporter – object that generates the report

  • filename – filename for the report

  • specs – specs that need reporting

  • raw_logs_dir – record directory for test log paths

spack.resource module

Describes an optional resource needed for a build.

Typically a bunch of sources that can be built in-tree within another package to enable optional features.

class spack.resource.Resource(name, fetcher, destination, placement)[source]

Bases: object

Represents an optional resource to be fetched by a package.

Aggregates a name, a fetcher, a destination and a placement.

spack.rewiring module

exception spack.rewiring.PackageNotInstalledError(spliced_spec, build_spec, dep)[source]

Bases: RewireError

Raised when the build_spec for a splice was not installed.

exception spack.rewiring.RewireError(message, long_msg=None)[source]

Bases: SpackError

Raised when something goes wrong with rewiring.

spack.rewiring.rewire(spliced_spec)[source]

Given a spliced spec, this function conducts all the rewiring on all nodes in the DAG of that spec.

spack.rewiring.rewire_node(spec, explicit)[source]

This function rewires a single node, worrying only about references to its subgraph. Binaries, text, and links are all changed in accordance with the splice. The resulting package is then ‘installed.’

spack.spec module

Spack allows very fine-grained control over how packages are installed and over how they are built and configured. To make this easy, it has its own syntax for declaring a dependence. We call a descriptor of a particular package configuration a “spec”.

The syntax looks like this:

$ spack install mpileaks ^openmpi @1.2:1.4 +debug %intel @12.1 target=zen
                0        1        2        3      4      5     6

The first part of this is the command, ‘spack install’. The rest of the line is a spec for a particular installation of the mpileaks package.

  1. The package to install

  2. A dependency of the package, prefixed by ^

  3. A version descriptor for the package. This can either be a specific version, like “1.2”, or it can be a range of versions, e.g. “1.2:1.4”. If multiple specific versions or multiple ranges are acceptable, they can be separated by commas, e.g. if a package will only build with versions 1.0, 1.2-1.4, and 1.6-1.8 of mvapich, you could say:

    depends_on("mvapich@1.0,1.2:1.4,1.6:1.8")

  4. A compile-time variant of the package. If you need openmpi to be built in debug mode for your package to work, you can require it by adding +debug to the openmpi spec when you depend on it. If you do NOT want the debug option to be enabled, then replace this with -debug. If you would like the variant to be propagated through all your package’s dependencies, use “++” for enabling and “--” or “~~” for disabling.

  5. The name of the compiler to build with.

  6. The versions of the compiler to build with. Note that the identifier for a compiler version is the same ‘@’ that is used for a package version. A version list denoted by ‘@’ is associated with the compiler only if it comes immediately after the compiler name. Otherwise it will be associated with the current package spec.

  7. The architecture to build with. This is needed on machines where cross-compilation is required. (A short constructed example follows this list.)
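
As a rough, hedged sketch, the same kind of spec string can be parsed programmatically with the Spec class documented below (the package names are those from the example above; this assumes a working Spack installation):

from spack.spec import Spec

# Parse an abstract (not yet concretized) spec from its string form.
s = Spec("mpileaks ^openmpi@1.2:1.4+debug %intel@12.1 target=zen")

print(s)                      # round-trips back to a spec string
print(s.name)                 # root package name: "mpileaks"
for dep in s.dependencies():  # dependencies given explicitly in the string
    print(dep.name)           # "openmpi"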

exception spack.spec.AmbiguousHashError(msg, *specs)[source]

Bases: SpecError

exception spack.spec.ArchitecturePropagationError(message, long_message=None)[source]

Bases: SpecError

Raised when the double equal symbols are used to assign the spec’s architecture.

class spack.spec.CompilerSpec(*args)[source]

Bases: object

The CompilerSpec field represents the compiler or range of compiler versions that a package should be built with. CompilerSpecs have a name and a version list.

property concrete

A CompilerSpec is concrete if its versions are concrete and there is an available compiler with the right version.

constrain(other: CompilerSpec) bool[source]

Intersect self’s versions with other.

Return whether the CompilerSpec changed.

copy()[source]
property display_str

Equivalent to {compiler.name}{@compiler.version} for Specs, without extra @= for readability.

static from_dict(d)[source]
intersects(other: CompilerSpec) bool[source]

Return True if there exists at least one concrete spec that matches both self and other, otherwise False.

For compiler specs this means that the name of the compiler must be the same for self and other, and that the version ranges should intersect.

Parameters:

other – spec to be satisfied

name
satisfies(other: CompilerSpec) bool[source]

Return True if all concrete specs matching self also match other, otherwise False.

For compiler specs this means that the name of the compiler must be the same for self and other, and that the version range of self is a subset of that of other.

Parameters:

other – spec to be satisfied

to_dict()[source]
property version
versions
exception spack.spec.DuplicateArchitectureError(message, long_message=None)[source]

Bases: SpecError

Raised when the same architecture occurs in a spec twice.

exception spack.spec.DuplicateCompilerSpecError(message, long_message=None)[source]

Bases: SpecError

Raised when the same compiler occurs in a spec twice.

exception spack.spec.DuplicateDependencyError(message, long_message=None)[source]

Bases: SpecError

Raised when the same dependency occurs in a spec twice.

exception spack.spec.InconsistentSpecError(message, long_message=None)[source]

Bases: SpecError

Raised when two nodes in the same spec DAG have inconsistent constraints.

exception spack.spec.InvalidDependencyError(pkg, deps)[source]

Bases: SpecError

Raised when a dependency in a spec is not actually a dependency of the package.

exception spack.spec.InvalidHashError(spec, hash)[source]

Bases: SpecError

exception spack.spec.MultipleProviderError(vpkg, providers)[source]

Bases: SpecError

Raised when more than one package provides a particular virtual dependency.

exception spack.spec.NoProviderError(vpkg)[source]

Bases: SpecError

Raised when there is no package that provides a particular virtual dependency.

class spack.spec.Spec(spec_like=None, normal=False, concrete=False, external_path=None, external_modules=None)[source]

Bases: object

abstract_hash = None
add_dependency_edge(dependency_spec: Spec, *, depflag: int, virtuals: Tuple[str, ...])[source]

Add a dependency edge to this spec.

Parameters:
  • dependency_spec – spec of the dependency

  • deptypes – dependency types for this edge

  • virtuals – virtuals provided by this edge

property anonymous
attach_git_version_lookup()[source]
property build_spec
cformat(*args, **kwargs)[source]

Same as format, but color defaults to auto instead of False.

clear_cached_hashes(ignore=())[source]

Clears all cached hashes in a Spec, while preserving other properties.

clear_dependencies()[source]

Trim the dependencies of this spec.

clear_edges()[source]

Trim the dependencies and dependents of this spec.

property colored_str
colorized()[source]
common_dependencies(other)[source]

Return names of dependencies that self and other have in common.

property concrete

A spec is concrete if it describes a single build of a package.

More formally, a spec is concrete if concretize() has been called on it and it has been marked _concrete.

Concrete specs either can be or have been built. All constraints have been resolved, optional dependencies have been added or removed, a compiler has been chosen, and all variants have values.

concretize(tests=False)[source]

Concretize the current spec.

Parameters:

tests (bool or list) – if False disregard ‘test’ dependencies, if a list of names activate them for the packages in the list, if True activate ‘test’ dependencies for all packages.

concretized(tests=False)[source]

This is a non-destructive version of concretize().

First clones, then returns a concrete version of this package without modifying this package.

Parameters:

tests (bool or list) – if False disregard ‘test’ dependencies, if a list of names activate them for the packages in the list, if True activate ‘test’ dependencies for all packages.

constrain(other, deps=True)[source]

Intersect self with other in-place. Return True if self changed, False otherwise.

Parameters:
  • other – constraint to be added to self

  • deps – if False, constrain only the root node, otherwise constrain dependencies as well.

Raises:

spack.error.UnsatisfiableSpecError – when self cannot be constrained

constrained(other, deps=True)[source]

Return a constrained copy without modifying this spec.

copy(deps: bool | str | List[str] | Tuple[str, ...] | int = True, **kwargs)[source]

Make a copy of this spec.

Parameters:
  • deps – Defaults to True. If boolean, controls whether dependencies are copied (copied if True). If a DepTypes or DepFlag is provided, only matching dependencies are copied.

  • kwargs – additional arguments for internal use (passed to _dup).

Returns:

A copy of this spec.

Examples

Deep copy with dependencies:

spec.copy()
spec.copy(deps=True)

Shallow copy (no dependencies):

spec.copy(deps=False)

Only build and run dependencies:

spec.copy(deps=('build', 'run'))
property cshort_spec

Returns an auto-colorized version of self.short_spec.

dag_hash(length=None)[source]

This is Spack’s default hash, used to identify installations.

Same as the full hash (includes package hash and build/link/run deps). Tells us when package files and any dependencies have changed.

NOTE: Versions of Spack prior to 0.18 only included link and run deps.

dag_hash_bit_prefix(bits)[source]

Get the first <bits> bits of the DAG hash as an integer type.

static default_arch()[source]

Return an anonymous spec for the default architecture

dependencies(name=None, deptype: str | List[str] | Tuple[str, ...] | int = 15) List[Spec][source]

Return a list of direct dependencies (nodes in the DAG).

Parameters:
  • name (str) – filter dependencies by package name

  • deptype – allowed dependency types

dependents(name=None, deptype: str | List[str] | Tuple[str, ...] | int = 15) List[Spec][source]

Return a list of direct dependents (nodes in the DAG).

Parameters:
  • name (str) – filter dependents by package name

  • deptype – allowed dependency types

detach(deptype='all')[source]

Remove any reference that dependencies have of this node.

Parameters:

deptype (str or tuple) – dependency types tracked by the current spec

direct_dep_difference(other)[source]

Returns dependencies in self that are not in other.

property edge_attributes: str

Helper method to print edge attributes in spec literals

edges_from_dependents(name=None, depflag: int = 15)[source]

Return a list of edges connecting this node in the DAG to parents.

Parameters:
  • name (str) – filter dependents by package name

  • depflag – allowed dependency types

edges_to_dependencies(name=None, depflag: int = 15)[source]

Return a list of edges connecting this node in the DAG to children.

Parameters:
  • name (str) – filter dependencies by package name

  • depflag – allowed dependency types

static ensure_external_path_if_external(external_spec)[source]
static ensure_no_deprecated(root)[source]

Raise if a deprecated spec is in the dag.

Parameters:

root (Spec) – root spec to be analyzed

Raises:

SpecDeprecatedError – if any deprecated spec is found

static ensure_valid_variants(spec)[source]

Ensures that the variants attached to a spec are valid.

Parameters:

spec (Spec) – spec to be analyzed

Raises:

spack.variant.UnknownVariantError – on the first unknown variant found

eq_dag(other, deptypes=True, vs=None, vo=None)[source]

True if the full dependency DAGs of specs are equal.

eq_node(other)[source]

Equality with another spec, not including dependencies.

property external
property external_path
static extract_json_from_clearsig(data)[source]
flat_dependencies(disconnect: bool = False)[source]

Build DependencyMap of all of this spec’s dependencies with their constraints merged.

Parameters:

disconnect – if True, disconnect all dependents and dependencies among nodes in this spec’s DAG.

format(format_string='{name}{@versions}{%compiler.name}{@compiler.versions}{compiler_flags}{variants}{arch=architecture}{/abstract_hash}', **kwargs)[source]

Prints out particular pieces of a spec, depending on what is in the format string.

Using the {attribute} syntax, any field of the spec can be selected. Those attributes can be recursive. For example, s.format('{compiler.version}') will print the version of the compiler.

Commonly used attributes of the Spec for format strings include:

name
version
compiler
compiler.name
compiler.version
compiler_flags
variants
architecture
architecture.platform
architecture.os
architecture.target
prefix

Some additional special-case properties can be added:

hash[:len]    The DAG hash with optional length argument
spack_root    The spack root directory
spack_install The spack install directory

The ^ sigil can be used to access dependencies by name. s.format('{^mpi.name}') will print the name of the MPI implementation in the spec.

The @, %, arch=, and / sigils can be used to include the sigil with the printed string. These sigils may only be used with the appropriate attributes, listed below:

@        ``{@version}``, ``{@compiler.version}``
%        ``{%compiler}``, ``{%compiler.name}``
arch=    ``{arch=architecture}``
/        ``{/hash}``, ``{/hash:7}``, etc

The @ sigil may also be used for any other property named version. Sigils printed with the attribute string are only printed if the attribute string is non-empty, and are colored according to the color of the attribute.

Sigils are not used for printing variants. Variants listed by name naturally print with their sigil. For example, spec.format('{variants.debug}') would print either +debug or ~debug depending on the name of the variant. Non-boolean variants print as name=value. To print variant names or values independently, use spec.format('{variants.<name>.name}') or spec.format('{variants.<name>.value}').

Spec format strings use \ as the escape character. Use \{ and \} for literal braces, and \\ for the literal \ character.

Parameters:

format_string (str) – string containing the format to be expanded

Keyword Arguments:
  • color (bool) – True if returned string is colored

  • transform (dict) – maps full-string formats to a callable that accepts a string and returns another one
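
For illustration, a minimal sketch of common format strings (the spec used here is only an example):

from spack.spec import Spec

s = Spec("zlib@1.3.1+shared")
print(s.format("{name}{@versions}{variants}"))   # e.g. zlib@1.3.1+shared
print(s.format("{name}-{version}"))              # bare version, no sigil: zlib-1.3.1
print(s.format_path("{name}/{version}"))         # safe for use as a directory path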

format_path(format_string: str, _path_ctor: Callable[[Any], PurePath] | None = None) str[source]

Given a format_string that is intended as a path, generate a string like from Spec.format, but eliminate extra path separators introduced by formatting of Spec properties.

Path separators explicitly added to the string are preserved, so for example “{name}/{version}” would generate a directory based on the Spec’s name and a subdirectory based on its version. This function guarantees, though, that the resulting string has only two directory levels: even if str(Spec.version) would normally contain a path separator, it will not in this case.

static from_detection(spec_str, extra_attributes=None)[source]

Construct a spec from a spec string determined during external detection and attach extra attributes to it.

Parameters:
  • spec_str (str) – spec string

  • extra_attributes (dict) – dictionary containing extra attributes

Returns:

external spec

Return type:

spack.spec.Spec

static from_dict(data)[source]

Construct a spec from JSON/YAML.

Parameters:

data – a nested dict/list data structure read from YAML or JSON.

static from_json(stream)[source]

Construct a spec from JSON.

Parameters:

stream – string or file object to read from.

static from_literal(spec_dict, normal=True)[source]

Builds a Spec from a dictionary containing the spec literal.

The dictionary must have a single top level key, representing the root, and as many secondary level keys as needed in the spec.

The keys can be either a string or a Spec or a tuple containing the Spec and the dependency types.

Parameters:
  • spec_dict (dict) – the dictionary containing the spec literal

  • normal (bool) – if True the same key appearing at different levels of the spec_dict will map to the same object in memory.

Examples

A simple spec foo with no dependencies:

{'foo': None}

A spec foo with a (build, link) dependency bar:

{'foo':
    {'bar:build,link': None}}

A spec with a diamond dependency and various build types:

{'dt-diamond': {
    'dt-diamond-left:build,link': {
        'dt-diamond-bottom:build': None
    },
    'dt-diamond-right:build,link': {
        'dt-diamond-bottom:build,link,run': None
    }
}}

The same spec with a double copy of dt-diamond-bottom and no diamond structure:

{'dt-diamond': {
    'dt-diamond-left:build,link': {
        'dt-diamond-bottom:build': None
    },
    'dt-diamond-right:build,link': {
        'dt-diamond-bottom:build,link,run': None
    }
}, normal=False}

Constructing a spec using a Spec object as key:

mpich = Spec('mpich')
libelf = Spec('libelf@1.8.11')
expected_normalized = Spec.from_literal({
    'mpileaks': {
        'callpath': {
            'dyninst': {
                'libdwarf': {libelf: None},
                libelf: None
            },
            mpich: None
        },
        mpich: None
    },
})
static from_signed_json(stream)[source]

Construct a spec from clearsigned json spec file.

Parameters:

stream – string or file object to read from.

static from_specfile(path)[source]

Construct a spec from a JSON or YAML spec file path

static from_yaml(stream)[source]

Construct a spec from YAML.

Parameters:

stream – string or file object to read from.

property fullname
index(deptype='all')[source]

Return a dictionary that points to all the dependencies in this spec.

static inject_patches_variant(root)[source]
install_status()[source]

Helper for tree to print DB install status.

property installed

Installation status of a package.

Returns:

True if the package has been installed, False otherwise.

property installed_upstream

Whether the spec is installed in an upstream repository.

Returns:

True if the package is installed in an upstream, False otherwise.

intersects(other: str | Spec, deps: bool = True) bool[source]

Return True if there exists at least one concrete spec that matches both self and other, otherwise False.

This operation is commutative, and if two specs intersect it means that one can constrain the other.

Parameters:
  • other – spec to be checked for compatibility

  • deps – if True check compatibility of dependency nodes too, if False only check root

lookup_hash()[source]

Given a spec with an abstract hash, return a copy of the spec with all properties and dependencies by looking up the hash in the environment, store, or finally, binary caches. This is non-destructive.

node_dict_with_hashes(hash=<spack.hash_types.SpecHashDescriptor object>)[source]

Returns a node_dict of this spec with the dag hash added. If this spec is concrete, the full hash is added as well. If ‘build’ is in the hash_type, the build hash is also added.

normalize(force=False, tests=False, user_spec_deps=None, disconnect=True)[source]

When specs are parsed, any dependencies specified are hanging off the root, and ONLY the ones that were explicitly provided are there. Normalization turns a partial flat spec into a DAG, where:

  1. Known dependencies of the root package are in the DAG.

  2. Each node’s dependencies dict only contains its known direct deps.

  3. There is only ONE unique spec for each package in the DAG.

    • This includes virtual packages. If there is a non-virtual package that provides a virtual package that is in the spec, then we replace the virtual package with the non-virtual one.

TODO: normalize should probably implement some form of cycle detection, to ensure that the spec is actually a DAG.

normalized()[source]

Return a normalized copy of this spec without modifying this spec.

property os
static override(init_spec, change_spec)[source]
property package
property package_class

Internal package call gets only the class object for a package. Use this to just get package metadata.

package_hash()[source]

Compute the hash of the contents of the package for this node

property patches

Return patch objects for any patch sha256 sums on this Spec.

This is for use after concretization to iterate over any patches associated with this spec.

TODO: this only checks in the package; it doesn’t resurrect old patches from install directories, but it probably should.

property platform
property prefix
process_hash(length=None)[source]

Hash used to transfer specs among processes.

This hash includes build and test dependencies and is only used to serialize a spec and pass it around among processes.

process_hash_bit_prefix(bits)[source]

Get the first <bits> bits of the DAG hash as an integer type.

replace_hash()[source]

Given a spec with an abstract hash, attempt to populate all properties and dependencies by looking up the hash in the environment, store, or finally, binary caches. This is destructive.

property root

Follow dependent links and find the root of this spec’s DAG.

Spack specs have a single root (the package being installed).

satisfies(other: str | Spec, deps: bool = True) bool[source]

Return True if all concrete specs matching self also match other, otherwise False.

Parameters:
  • other – spec to be satisfied

  • deps – if True descend to dependencies, otherwise only check root node
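
A small sketch contrasting intersects (documented above) with satisfies; the package and version ranges are only illustrative:

from spack.spec import Spec

a = Spec("openmpi@1.2:1.4")
b = Spec("openmpi@1.3:1.6")

a.intersects(b)                    # True: e.g. openmpi@1.3 matches both ranges
a.satisfies(b)                     # False: openmpi@1.2 matches a but not b
Spec("openmpi@1.3").satisfies(a)   # True: every spec at 1.3 also matches 1.2:1.4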

property short_spec

Returns a version of the spec with the dependencies hashed instead of completely enumerated.

spec_hash(hash)[source]

Utility method for computing different types of Spec hashes.

Parameters:

hash (spack.hash_types.SpecHashDescriptor) – type of hash to generate.

splice(other, transitive)[source]

Splices dependency “other” into this (“target”) Spec, and return the result as a concrete Spec. If transitive, then other and its dependencies will be extrapolated to a list of Specs and spliced in accordingly.

For example, let there exist a dependency graph as follows:

T
| \
Z<-H

In this example, Spec T depends on H and Z, and H also depends on Z. Suppose, however, that we wish to use a different H, known as H’. This function will splice in the new H’ in one of two ways:

  1. transitively, where H’ depends on the Z’ it was built with, and the new T* also directly depends on this new Z’, or

  2. intransitively, where the new T* and H’ both depend on the original Z.

Since the Spec returned by this splicing function is no longer deployed the same way it was built, any such changes are tracked by setting the build_spec to point to the corresponding dependency from the original Spec.

TODO: Extend this for non-concrete Specs.

property spliced

Returns whether or not this Spec is being deployed as built i.e. whether or not this Spec has ever been spliced.

property target
to_dict(hash=<spack.hash_types.SpecHashDescriptor object>)[source]

Create a dictionary suitable for writing this spec to YAML or JSON.

This is a dictionary like the one that is ultimately written to a spec.json file in each Spack installation directory. For example, for sqlite:

{
"spec": {
    "_meta": {
    "version": 2
    },
    "nodes": [
    {
        "name": "sqlite",
        "version": "3.34.0",
        "arch": {
        "platform": "darwin",
        "platform_os": "catalina",
        "target": "x86_64"
        },
        "compiler": {
        "name": "apple-clang",
        "version": "11.0.0"
        },
        "namespace": "builtin",
        "parameters": {
        "column_metadata": true,
        "fts": true,
        "functions": false,
        "rtree": false,
        "cflags": [],
        "cppflags": [],
        "cxxflags": [],
        "fflags": [],
        "ldflags": [],
        "ldlibs": []
        },
        "dependencies": [
        {
            "name": "readline",
            "hash": "4f47cggum7p4qmp3xna4hi547o66unva",
            "type": [
            "build",
            "link"
            ]
        },
        {
            "name": "zlib",
            "hash": "uvgh6p7rhll4kexqnr47bvqxb3t33jtq",
            "type": [
            "build",
            "link"
            ]
        }
        ],
        "hash": "tve45xfqkfgmzwcyfetze2z6syrg7eaf",
    },
        # ... more node dicts for readline and its dependencies ...
    ]
}

Note that this dictionary starts with the ‘spec’ key, and what follows is a list starting with the root spec, followed by its dependencies in preorder. Each node in the list also has a ‘hash’ key that contains the hash of the node without the hash field included.

In the example, the package content hash is not included in the spec, but if package_hash were true there would be an additional field on each node called package_hash.

from_dict() can be used to read back in a spec that has been converted to a dictionary, serialized, and read back in.

Parameters:
  • deptype (tuple or str) – dependency types to include when traversing the spec.

  • package_hash (bool) – whether to include package content hashes in the dictionary.

to_json(stream=None, hash=<spack.hash_types.SpecHashDescriptor object>)[source]
to_node_dict(hash=<spack.hash_types.SpecHashDescriptor object>)[source]

Create a dictionary representing the state of this Spec.

to_node_dict creates the content that is eventually hashed by Spack to create identifiers like the DAG hash (see dag_hash()). Example result of to_node_dict for the sqlite package:

{
    'sqlite': {
        'version': '3.28.0',
        'arch': {
            'platform': 'darwin',
            'platform_os': 'mojave',
            'target': 'x86_64',
        },
        'compiler': {
            'name': 'apple-clang',
            'version': '10.0.0',
        },
        'namespace': 'builtin',
        'parameters': {
            'fts': 'true',
            'functions': 'false',
            'cflags': [],
            'cppflags': [],
            'cxxflags': [],
            'fflags': [],
            'ldflags': [],
            'ldlibs': [],
        },
        'dependencies': {
            'readline': {
                'hash': 'zvaa4lhlhilypw5quj3akyd3apbq5gap',
                'type': ['build', 'link'],
            }
        },
    }
}

Note that the dictionary returned does not include the hash of the root of the spec, though it does include hashes for each dependency, and (optionally) the package file corresponding to each node.

See to_dict() for a “complete” spec hash, with hashes for each node and nodes for each dependency (instead of just their hashes).

Parameters:

hash (spack.hash_types.SpecHashDescriptor) –

to_yaml(stream=None, hash=<spack.hash_types.SpecHashDescriptor object>)[source]
traverse(**kwargs)[source]

Shorthand for traverse_nodes()

traverse_edges(**kwargs)[source]

Shorthand for traverse_edges()

tree(*, color: bool | None = None, depth: bool = False, hashes: bool = False, hashlen: int | None = None, cover: str = 'nodes', indent: int = 0, format: str = '{name}{@versions}{%compiler.name}{@compiler.versions}{compiler_flags}{variants}{arch=architecture}{/abstract_hash}', deptypes: Tuple[str, ...] | str = 'all', show_types: bool = False, depth_first: bool = False, recurse_dependencies: bool = True, status_fn: Callable[[Spec], InstallStatus] | None = None, prefix: Callable[[Spec], str] | None = None) str[source]

Prints out this spec and its dependencies, tree-formatted with indentation.

Status function may either output a boolean or an InstallStatus

Parameters:
  • color – if True, always colorize the tree. If False, don’t colorize the tree. If None, use the default from llnl.tty.color

  • depth – print the depth from the root

  • hashes – if True, print the hash of each node

  • hashlen – length of the hash to be printed

  • cover – either “nodes” or “edges”

  • indent – extra indentation for the tree being printed

  • format – format to be used to print each node

  • deptypes – dependency types to be represented in the tree

  • show_types – if True, show the (merged) dependency type of a node

  • depth_first – if True, traverse the DAG depth first when representing it as a tree

  • recurse_dependencies – if True, recurse on dependencies

  • status_fn – optional callable that takes a node as an argument and return its installation status

  • prefix – optional callable that takes a node as an argument and return its installation prefix
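
A minimal sketch, assuming a working Spack installation (the package is only an example):

from spack.spec import Spec

# Concretization requires a working Spack instance; zlib is illustrative.
spec = Spec("zlib").concretized()
print(spec.tree(hashes=True, hashlen=7, show_types=True))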

trim(dep_name)[source]

Remove any package that is or provides dep_name transitively from this tree. This can also remove other dependencies if they are only present because of dep_name.

update_variant_validate(variant_name, values)[source]

If it is not already there, adds the variant named variant_name to the spec, based on the definition contained in the package metadata. Validates the variant and values before returning.

Used to add values to a variant without being sensitive to the variant being single or multi-valued. If the variant already exists on the spec it is assumed to be multi-valued and the values are appended.

Parameters:
  • variant_name – the name of the variant to add or append to

  • values – the value or values (as a tuple) to add/append to the variant

validate_detection()[source]

Validate the detection of an external spec.

This method is used as part of Spack’s detection protocol, and is not meant for client code use.

validate_or_raise()[source]

Checks that names and values in this spec are real. If they’re not, it will raise an appropriate exception.

property version
property virtual
virtual_dependencies()[source]

Return list of any virtual deps in this spec.

exception spack.spec.SpecDeprecatedError(message, long_message=None)[source]

Bases: SpecError

Raised when a spec concretizes to a deprecated spec or dependency.

exception spack.spec.SpecParseError(parse_error)[source]

Bases: SpecError

Wrapper for ParseError for when we’re parsing specs.

property long_message
exception spack.spec.UnsatisfiableArchitectureSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a spec architecture conflicts with package constraints.

exception spack.spec.UnsatisfiableCompilerFlagSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a spec compiler flag conflicts with package constraints.

exception spack.spec.UnsatisfiableCompilerSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a spec compiler conflicts with package constraints.

exception spack.spec.UnsatisfiableDependencySpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when some dependencies of constrained specs are incompatible.

exception spack.spec.UnsatisfiableProviderSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a provider is supplied but constraints don’t match a vpkg requirement

exception spack.spec.UnsatisfiableSpecNameError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when two specs aren’t even for the same package.

exception spack.spec.UnsatisfiableVersionSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a spec version conflicts with package constraints.

exception spack.spec.UnsupportedCompilerError(compiler_name)[source]

Bases: SpecError

Raised when the user asks for a compiler spack doesn’t know about.

spack.spec_list module

exception spack.spec_list.InvalidSpecConstraintError(message, long_message=None)[source]

Bases: SpecListError

Error class for invalid spec constraints at concretize time.

class spack.spec_list.SpecList(name='specs', yaml_list=None, reference=None)[source]

Bases: object

add(spec)[source]
extend(other, copy_reference=True)[source]
property is_matrix
remove(spec)[source]
replace(idx: int, spec: str)[source]

Replace the existing spec at the index with the new one.

Parameters:
  • idx – index of the spec to replace in the speclist

  • spec – new spec

property specs: List[Spec]
property specs_as_constraints
property specs_as_yaml_list
update_reference(reference)[source]
exception spack.spec_list.SpecListError(message, long_message=None)[source]

Bases: SpackError

Error class for all errors related to SpecList objects.

exception spack.spec_list.UndefinedReferenceError(message, long_message=None)[source]

Bases: SpecListError

Error class for undefined references in Spack stacks.

spack.stage module

class spack.stage.DIYStage(path)[source]

Bases: object

Simple class that allows any directory to be a spack stage. Consequently, it does not expect or require that the source path adhere to the standard directory naming convention.

cache_local()[source]
check()[source]
create()[source]
destroy()[source]
expand_archive()[source]
property expanded

Returns True since the source_path must exist.

fetch(*args, **kwargs)[source]
needs_fetching = False
requires_patch_success = False
restage()[source]
class spack.stage.DevelopStage(name, dev_path, reference_link)[source]

Bases: LockableStagingDir

property archive_file
cache_local()[source]
check()[source]
create()[source]

Ensures the top-level (config:build_stage) directory exists.

destroy()[source]
expand_archive()[source]
property expanded

Returns True since the source_path must exist.

fetch(*args, **kwargs)[source]
needs_fetching = False
requires_patch_success = False
restage()[source]
class spack.stage.LockableStagingDir(name, path, keep, lock)[source]

Bases: object

A directory whose lifetime can be managed with a context manager (but persists if the user requests it). Instances can have a specified name and if they do, then for all instances that have the same name, only one can enter the context manager at a time.

create()[source]

Ensures the top-level (config:build_stage) directory exists.

destroy()[source]
class spack.stage.ResourceStage(fetch_strategy: FetchStrategy, root: Stage, resource: Resource, **kwargs)[source]

Bases: Stage

expand_archive()[source]

Changes to the stage directory and attempts to expand the downloaded archive. Fails if the stage is not set up or if the archive is not yet downloaded.

restage()[source]

Removes the expanded archive path if it exists, then re-expands the archive.

exception spack.stage.RestageError(message, long_message=None)[source]

Bases: StageError

Error encountered during restaging.

class spack.stage.Stage(url_or_fetch_strategy, name=None, mirror_paths=None, keep=False, path=None, lock=True, search_fn=None)[source]

Bases: LockableStagingDir

Manages a temporary stage directory for building.

A Stage object is a context manager that handles a directory where some source code is downloaded and built before being installed. It handles fetching the source code, either as an archive to be expanded or by checking it out of a repository. A stage’s lifecycle looks like this:

with Stage() as stage:      # Context manager creates and destroys the
                            # stage directory
    stage.fetch()           # Fetch a source archive into the stage.
    stage.expand_archive()  # Expand the archive into source_path.
    <install>               # Build and install the archive.
                            # (handled by user of Stage)

When used as a context manager, the stage is automatically destroyed if no exception is raised by the context. If an exception is raised, the stage is left in the filesystem and NOT destroyed, for potential reuse later.

You can also use the stage’s create/destroy functions manually, like this:

stage = Stage()
try:
    stage.create()          # Explicitly create the stage directory.
    stage.fetch()           # Fetch a source archive into the stage.
    stage.expand_archive()  # Expand the archive into source_path.
    <install>               # Build and install the archive.
                            # (handled by user of Stage)
finally:
    stage.destroy()         # Explicitly destroy the stage directory.

There are two kinds of stages: named and unnamed. Named stages can persist between runs of spack, e.g. if you fetched a tarball but didn’t finish building it, you won’t have to fetch it again.

Unnamed stages are created using standard mkdtemp mechanisms or similar, and are intended to persist for only one run of spack.

property archive_file

Path to the source archive within this stage directory.

cache_local()[source]
cache_mirror(mirror, stats)[source]

Perform a fetch if the resource is not already cached

Parameters:
  • mirror – the mirror in which to cache this stage’s resource

  • stats – updated depending on whether the caching was successful or not

check()[source]

Check the downloaded archive against a checksum digest. No-op if this stage checks code out of a repository.

destroy()[source]

Removes this stage directory.

disable_mirrors()[source]

The Stage will not attempt to look for the associated fetcher target in any of Spack’s mirrors (including the local download cache).

expand_archive()[source]

Changes to the stage directory and attempts to expand the downloaded archive. Fails if the stage is not set up or if the archive is not yet downloaded.

property expanded

Returns True if source path expanded; else False.

property expected_archive_files

Possible archive file paths.

fetch(mirror_only=False, err_msg=None)[source]

Retrieves the code or archive

Parameters:
  • mirror_only (bool) – only fetch from a mirror

  • err_msg (str or None) – the error message to display if all fetchers fail or None for the default fetch failure message

needs_fetching = True

Most staging is managed by Spack. DIYStage is one exception.

requires_patch_success = True
restage()[source]

Removes the expanded archive path if it exists, then re-expands the archive.

property save_filename
property source_path

Returns the well-known source directory path.

steal_source(dest)[source]

Copy the source_path directory in its entirety to directory dest

This operation creates, fetches, and expands the stage if that has not already been done, and destroys the stage when finished.

class spack.stage.StageComposite[source]

Bases: Composite

Composite for Stage type objects. The first item in this composite is considered to be the root package, and operations that return a value are forwarded to it.

property archive_file
property expanded
classmethod from_iterable(iterable: Iterable[Stage]) StageComposite[source]

Create a new composite from an iterable of stages.

property keep
property path
property source_path
exception spack.stage.StageError(message, long_message=None)[source]

Bases: SpackError

Superclass for all errors encountered during staging.

exception spack.stage.StagePathError(message, long_message=None)[source]

Bases: StageError

Error encountered with stage path.

exception spack.stage.VersionFetchError(message, long_message=None)[source]

Bases: StageError

Raised when we can’t determine a URL to fetch a package.

spack.stage.compute_stage_name(spec)[source]

Determine stage name given a spec

spack.stage.create_stage_root(path: str) None[source]

Create the stage root directory and ensure appropriate access perms.

spack.stage.ensure_access(file)[source]

Ensure we can access a directory and die with an error if we can’t.

spack.stage.get_checksums_for_versions(url_by_version: Dict[StandardVersion, str], package_name: str, *, first_stage_function: Callable[[Stage, str], None] | None = None, keep_stage: bool = False, concurrency: int | None = None, fetch_options: Dict[str, str] | None = None) Dict[StandardVersion, str][source]

Computes the checksums for each version passed in input, and returns the results.

Archives are fetched according to the url_by_version dictionary passed as input.

The first_stage_function argument allows the caller to inspect the first downloaded archive, e.g., to determine the build system.

Parameters:
  • url_by_version – URL keyed by version

  • package_name – name of the package

  • first_stage_function – function that takes a Stage and a URL; this is run on the stage of the first URL downloaded

  • keep_stage – whether to keep staging area when command completes

  • batch – whether to ask user how many versions to fetch (false) or fetch all versions (true)

  • fetch_options – options used for the fetcher (such as timeout or cookies)

  • concurrency – maximum number of workers to use for retrieving archives

Returns:

A dictionary mapping each version to the corresponding checksum
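
A hedged sketch of the calling convention; the package name and URL are hypothetical, and running this would actually download the archive:

import spack.stage
from spack.version import Version

# Hypothetical package "foo" with a single version to checksum.
url_by_version = {Version("1.2.3"): "https://example.com/releases/foo-1.2.3.tar.gz"}
checksums = spack.stage.get_checksums_for_versions(url_by_version, "foo", concurrency=4)
# checksums maps each version to its computed digest, e.g. {Version("1.2.3"): "<sha256>"}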

spack.stage.get_stage_root()[source]
spack.stage.interactive_version_filter(url_dict: ~typing.Dict[~spack.version.version_types.StandardVersion, str], known_versions: ~typing.Iterable[~spack.version.version_types.StandardVersion] = (), *, initial_verion_filter: ~spack.version.version_types.VersionList | None = None, url_changes: ~typing.Set[~spack.version.version_types.StandardVersion] = {}, input: ~typing.Callable[[...], str] = <built-in function input>) Dict[StandardVersion, str] | None[source]

Interactively filter the list of spidered versions.

Parameters:
  • url_dict – Dictionary of versions to URLs

  • known_versions – Versions that can be skipped because they are already known

Returns:

Filtered dictionary of versions to URLs or None if the user wants to quit

spack.stage.purge()[source]

Remove all build directories in the top-level stage path.

spack.store module

Components that manage Spack’s installation tree.

An install tree, or “build store” consists of two parts:

  1. A package database that tracks what is installed.

  2. A directory layout that determines how the installations are laid out.

The store contains all the install prefixes for packages installed by Spack. The simplest store could just contain prefixes named by DAG hash, but we use a fancier directory layout to make browsing the store and debugging easier.

spack.store.DEFAULT_INSTALL_TREE_ROOT = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/opt/spack'

default installation root, relative to the Spack install path

exception spack.store.MatchError(message, long_message=None)[source]

Bases: SpackError

Error occurring when trying to match specs in store against a constraint

spack.store.STORE: Store | Singleton = <spack.store.Store object>

Singleton store instance

class spack.store.Store(root: str, unpadded_root: str | None = None, projections: Dict[str, str] | None = None, hash_length: int | None = None, upstreams: List[Database] | None = None, lock_cfg: LockConfiguration = (False, None, None))[source]

Bases: object

A store is a path full of installed Spack packages.

Stores consist of packages installed according to a DirectoryLayout, along with a database of their contents.

The directory layout controls what paths look like and how Spack ensures that each unique spec gets its own unique directory (or not, though we don’t recommend that).

The database is a single file that caches metadata for the entire Spack installation. It prevents us from having to spider the install tree to figure out what’s there.

The store is also able to lock installation prefixes, and to mark installation failures.

Parameters:
  • root – path to the root of the install tree

  • unpadded_root – path to the root of the install tree without padding. The sbang script has to be installed here to work with padded roots

  • projections – expression according to guidelines that describes how to construct a path to a package prefix in this store

  • hash_length – length of the hashes used in the directory layout. Spec hash suffixes will be truncated to this length

  • upstreams – optional list of upstream databases

  • lock_cfg – lock configuration for the database

reindex() None[source]

Convenience function to reindex the store DB with its own layout.

spack.store.create(configuration: Configuration | Singleton) Store[source]

Create a store from the configuration passed as input.

Parameters:

configuration – configuration to create a store.

spack.store.ensure_singleton_created() None[source]

Ensures the lazily evaluated singleton is created

spack.store.find(constraints: str | List[str] | List[Spec], multiple: bool = False, query_fn: Callable[[Any], List[Spec]] | None = None, **kwargs) List[Spec][source]

Returns a list of specs matching the constraints passed as inputs.

At least one spec per constraint must match, otherwise the function will error with an appropriate message.

By default, this function queries the current store, but a custom query function can be passed to hit any other source of concretized specs (e.g. a binary cache).

The query function must accept a spec as its first argument.

Parameters:
  • constraints – spec(s) to be matched against installed packages

  • multiple – if True multiple matches per constraint are admitted

  • query_fn (Callable) – query function to get matching specs. By default, spack.store.STORE.db.query

  • **kwargs – keyword arguments forwarded to the query function
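
A minimal usage sketch; the constraints are illustrative and must match packages that are actually installed:

import spack.store

# Each constraint must match at least one installed spec; multiple=True admits
# more than one match per constraint.
matches = spack.store.find(["zlib", "cmake"], multiple=True)
for s in matches:
    print(s.format("{name}{@version}{/hash:7}"))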

spack.store.parse_install_tree(config_dict)[source]

Parse config settings and return values relevant to the store object.

Parameters:

config_dict (dict) – dictionary of config values, as returned from spack.config.get(‘config’)

Returns:

Triple of the install tree root, the unpadded install tree root (before padding was applied), and the projections for the install tree.

Return type:

(tuple)

Encapsulate backwards compatibility capabilities for install_tree and deprecated values that are now parsed as part of install_tree.

spack.store.reinitialize()[source]

Restore globals to the same state they would have at start-up. Return a token containing the state of the store before reinitialization.

spack.store.restore(token)[source]

Restore the environment from a token returned by reinitialize

spack.store.specfile_matches(filename: str, **kwargs) List[Spec][source]

Same as find but reads the query from a spec file.

Parameters:
  • filename – YAML or JSON file from which to read the query.

  • **kwargs – keyword arguments forwarded to “find”

spack.store.use_store(path: str | Path, extra_data: Dict[str, Any] | None = None) Generator[Store, None, None][source]

Use the store passed as argument within the context manager.

Parameters:
  • path – path to the store.

  • extra_data – extra configuration under “config:install_tree” to be taken into account.

Yields:

Store object associated with the context manager’s store
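
A minimal sketch; the store path below is hypothetical:

import spack.store

# Inside the block, spack.store.STORE points at the alternate install tree root.
with spack.store.use_store("/tmp/spack-alt-store") as store:
    print(store.root)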

spack.subprocess_context module

This module handles transmission of Spack state to child processes started using the ‘spawn’ start method. Notably, installations are performed in a subprocess and require transmitting the Package object (in such a way that the repository is available for importing when it is deserialized); installations performed in Spack unit tests may include additional modifications to global state in memory that must be replicated in the child process.

class spack.subprocess_context.PackageInstallContext(pkg)[source]

Bases: object

Captures the in-memory process state of a package installation that needs to be transmitted to a child process.

restore()[source]
class spack.subprocess_context.SpackTestProcess(fn)[source]

Bases: object

create()[source]
class spack.subprocess_context.TestPatches(module_patches, class_patches)[source]

Bases: object

restore()[source]
class spack.subprocess_context.TestState[source]

Bases: object

Spack tests may modify state that is normally read from disk in memory; this object is responsible for properly serializing that state to be applied to a subprocess. This isn’t needed outside of a testing environment but this logic is designed to behave the same inside or outside of tests.

restore()[source]
spack.subprocess_context.append_patch(patch)[source]
spack.subprocess_context.clear_patches()[source]
spack.subprocess_context.serialize(obj)[source]
spack.subprocess_context.store_patches()[source]

spack.tag module

Classes and functions to manage package tags

class spack.tag.TagIndex(repository)[source]

Bases: Mapping

Maps tags to list of packages.

copy()[source]

Return a deep copy of this index.

static from_json(stream, repository)[source]
get_packages(tag)[source]

Returns all packages associated with the tag.

merge(other)[source]

Merge another tag index into this one.

Parameters:

other (TagIndex) – tag index to be merged

property tags
to_json(stream)[source]
update_package(pkg_name)[source]

Updates a package in the tag index.

Parameters:

pkg_name (str) – name of the package to be updated in the index

exception spack.tag.TagIndexError(message, long_message=None)[source]

Bases: SpackError

Raised when there is a problem with a TagIndex.

spack.tag.packages_with_tags(tags, installed, skip_empty)[source]

Returns a dict, indexed by tag, containing lists of names of packages that have the tag; if no tags are given, entries for all available tags are returned.

Parameters:
  • tags (list or None) – list of tags of interest or None for all

  • installed (bool) – True if want names of packages that are installed; otherwise, False if want all packages with the tag

  • skip_empty (bool) – True if exclude tags with no associated packages; otherwise, False if want entries for all tags even when no such tagged packages
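
As an illustration (the tag name is only an example):

import spack.tag

# Map each requested tag to the names of (all known, not just installed) packages carrying it.
by_tag = spack.tag.packages_with_tags(["build-tools"], installed=False, skip_empty=True)
for tag, names in by_tag.items():
    print(tag, sorted(names))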

spack.target module

class spack.target.Target(name, module_name=None)[source]

Bases: object

static from_dict_or_value(dict_or_value)[source]
property name
optimization_flags(compiler)[source]

Returns the flags needed to optimize for this target using the compiler passed as argument.

Parameters:

compiler (spack.spec.CompilerSpec or spack.compiler.Compiler) – object that contains both the name and the version of the compiler we want to use

to_dict_or_value()[source]

Returns a dict or a value representing the current target.

String values are used to keep backward compatibility with generic targets, like e.g. x86_64 or ppc64. More specific micro-architectures will return a dictionary which contains information on the name, features, vendor, generation and parents of the current target.

spack.tengine module

class spack.tengine.Context[source]

Bases: object

Base class for context classes that are used with the template engine.

context_properties = []
to_dict()[source]

Returns a dictionary containing all the context properties.

class spack.tengine.ContextMeta(name, bases, attr_dict)[source]

Bases: type

Meta class for Context. It helps reduce the boilerplate in client code.

classmethod context_property(func)[source]

Decorator that adds a function name to the list of new context properties, and then returns a property.

spack.tengine.context_property(func)

A saner way to use the decorator
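
A minimal sketch of a Context subclass using the decorator; the class and property names are illustrative:

import datetime

import spack.tengine as tengine

class ExampleContext(tengine.Context):
    """Hypothetical context feeding values into a template."""

    @tengine.context_property
    def timestamp(self):
        return datetime.datetime.now().isoformat()

# to_dict() collects every registered context property.
print(ExampleContext().to_dict())   # {'timestamp': '...'}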

spack.tengine.curly_quote(text)[source]

Encloses each line of text in curly braces

spack.tengine.make_environment(dirs: Tuple[str, ...] | None = None)[source]

Returns a configured environment for template rendering.

spack.tengine.prepend_to_line(text, token)[source]

Prepends a token to each line in text

spack.tengine.quote(text)[source]

Quotes each line in text

spack.traverse module

spack.traverse.traverse_edges(specs, root=True, order='pre', cover='nodes', direction='children', deptype: int | str | ~typing.List[str] | ~typing.Tuple[str, ...] = 'all', depth=False, key=<built-in function id>, visited=None)[source]

Generator that yields edges from the DAG, starting from a list of root specs.

Parameters:
  • specs (list) – List of root specs (considered to be depth 0)

  • root (bool) – Yield the root nodes themselves

  • order (str) – What order of traversal to use in the DAG. For depth-first search this can be pre or post. For BFS this should be breadth. For topological order use topo

  • cover (str) – Determines how extensively to cover the dag. Possible values: nodes – Visit each unique node in the dag only once. edges – If a node has been visited once but is reached along a new path, it’s accepted, but not recursively followed. This traverses each ‘edge’ in the DAG once. paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they’re reachable by multiple paths.

  • direction (str) – children or parents. If children, does a traversal of this spec’s children. If parents, traverses upwards in the DAG towards the root.

  • deptype – allowed dependency types

  • depth (bool) – When False, yield just edges. When True yield the tuple (depth, edge), where depth corresponds to the depth at which edge.spec was discovered.

  • key – function that takes a spec and outputs a key for uniqueness test.

  • visited (set or None) – a set of nodes not to follow

Returns:

A generator that yields DependencySpec if depth is False or a tuple of (depth, DependencySpec) if depth is True.

spack.traverse.traverse_nodes(specs, root=True, order='pre', cover='nodes', direction='children', deptype: int | str | ~typing.List[str] | ~typing.Tuple[str, ...] = 'all', depth=False, key=<built-in function id>, visited=None)[source]

Generator that yields specs from the DAG, starting from a list of root specs.

Parameters:
  • specs (list) – List of root specs (considered to be depth 0)

  • root (bool) – Yield the root nodes themselves

  • order (str) – What order of traversal to use in the DAG. For depth-first search this can be pre or post. For BFS this should be breadth.

  • cover (str) – Determines how extensively to cover the dag. Possible values: nodes – Visit each unique node in the dag only once. edges – If a node has been visited once but is reached along a new path, it’s accepted, but not recursively followed. This traverses each ‘edge’ in the DAG once. paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they’re reachable by multiple paths.

  • direction (str) – children or parents. If children, does a traversal of this spec’s children. If parents, traverses upwards in the DAG towards the root.

  • deptype – allowed dependency types

  • depth (bool) – When False, yield just the nodes (specs). When True yield the tuple (depth, spec), where depth corresponds to the depth at which the spec was discovered.

  • key – function that takes a spec and outputs a key for uniqueness test.

  • visited (set or None) – a set of nodes not to follow

Yields:

By default Spec, or a tuple (depth, Spec) if depth is set to True.

spack.traverse.traverse_tree(specs, cover='nodes', deptype: int | str | ~typing.List[str] | ~typing.Tuple[str, ...] = 'all', key=<built-in function id>, depth_first=True)[source]

Generator that yields (depth, DependencySpec) tuples in depth-first pre-order, so that a tree can be printed from it.

Parameters:
  • specs (list) – List of root specs (considered to be depth 0)

  • cover (str) – Determines how extensively to cover the dag. Possible values: nodes – Visit each unique node in the dag only once. edges – If a node has been visited once but is reached along a new path, it’s accepted, but not recursively followed. This traverses each ‘edge’ in the DAG once. paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they’re reachable by multiple paths.

  • deptype – allowed dependency types

  • key – function that takes a spec and outputs a key for uniqueness test.

  • depth_first (bool) – Explore the tree in depth-first or breadth-first order. When setting depth_first=True and cover=nodes, each spec only occurs once at the shallowest level, which is useful when rendering the tree in a terminal.

Returns:

A generator that yields (depth, DependencySpec) tuples in such an order that a tree can be printed.

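To show how traverse_tree() is typically consumed, the sketch below prints an indented dependency tree; it makes the same assumptions as the traversal example above (a working installation and a concretizable hdf5 package).

import spack.spec
import spack.traverse as traverse

root = spack.spec.Spec("hdf5").concretized()

# Each spec appears once, at its shallowest depth, which suits terminal output.
for depth, edge in traverse.traverse_tree([root], cover="nodes", depth_first=True):
    print("    " * depth + edge.spec.name)
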
spack.url module

This module has methods for parsing names and versions of packages from URLs. The idea is to allow package creators to supply nothing more than the download location of the package, and figure out version and name information from there.

Example: when spack is given the following URL:

    https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz

It can figure out that the package name is hdf, and that it is at version 4.2.12. This is useful for making the creation of packages simple: a user just supplies a URL and skeleton code is generated automatically.

Spack can also figure out that it can most likely download 4.2.6 at this URL:

    https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.6/src/hdf-4.2.6.tar.gz

This is useful if a user asks for a package at a particular version number; spack doesn’t need anyone to tell it where to get the tarball even though it’s never been told about that version before.

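A minimal sketch of the parsing entry points documented below, using the hdf URL from the example above; the expected output simply restates what those functions are documented to return.

import spack.url

url = "https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz"

name, version = spack.url.parse_name_and_version(url)
print(name, version)  # expected: hdf 4.2.12

# Point the same URL at a different version (see substitute_version() below).
print(spack.url.substitute_version(url, "4.2.6"))
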
exception spack.url.UndetectableNameError(path)[source]

Bases: UrlParseError

Raised when we can’t parse a package name from a string.

exception spack.url.UndetectableVersionError(path)[source]

Bases: UrlParseError

Raised when we can’t parse a version from a string.

exception spack.url.UrlParseError(msg, path)[source]

Bases: SpackError

Raised when the URL module can’t parse something correctly.

spack.url.color_url(path, **kwargs)[source]

Color the parts of the url according to Spack’s parsing.

Colors are:
Cyan: The version found by parse_version_offset().
Red: The name found by parse_name_offset().
Green: Instances of version string from substitute_version().
Magenta: Instances of the name (protected from substitution).
Parameters:
  • path (str) – The filename or URL for the package

  • errors (bool) – Append parse errors at end of string.

  • subs (bool) – Color substitutions as well as parsed name/version.

spack.url.find_all(substring, string)[source]

Returns a list containing the indices of every occurrence of substring in string.

spack.url.find_versions_of_archive(archive_urls: str | Sequence[str], list_url: str | None = None, list_depth: int = 0, concurrency: int | None = 32, reference_package: Any | None = None) Dict[StandardVersion, str][source]

Scrape web pages for new versions of a tarball. When more than one candidate URL exists for a version, this function prefers them in the following order: links found on the scraped pages that match a URL generated by the reference package; links that are found and appear in the archive_urls list; links that are found and are derived from those in the archive_urls list; and, if nothing is found for a version, the corresponding item from the archive_urls list itself.

Parameters:
  • archive_urls – URL or sequence of URLs for different versions of a package. Typically these are just the tarballs from the package file itself. By default, this searches the parent directories of archives.

  • list_url – URL for a listing of archives. Spack will scrape these pages for download links that look like the archive URL.

  • list_depth – max depth to follow links on list_url pages. Defaults to 0.

  • concurrency – maximum number of concurrent requests

  • reference_package – a spack package used as a reference for url detection. Uses the url_for_version method on the package to produce reference urls which, if found, are preferred.

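A hedged sketch of scraping for additional versions; note that this performs real network requests, and the hdf URL below (reused from the module example) may or may not still expose a browsable listing.

import spack.url

versions = spack.url.find_versions_of_archive(
    "https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz",
    list_depth=1,  # follow links one level below the listing page
)
for version, url in sorted(versions.items()):
    print(version, url)
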
spack.url.parse_name(path, ver=None)[source]

Try to determine the name of a package from its filename or URL.

Parameters:
  • path (str) – The filename or URL for the package

  • ver (str) – The version of the package

Returns:

The name of the package

Return type:

str

Raises:

UndetectableNameError – If the URL does not match any regexes

spack.url.parse_name_and_version(path)[source]

Try to determine the name of a package and extract its version from its filename or URL.

Parameters:

path (str) – The filename or URL for the package

Returns:

a tuple containing the package (name, version)

Return type:

tuple

Raises:

UndetectableVersionError – If the URL does not match any regexes

UndetectableNameError – If the URL does not match any regexes

spack.url.parse_name_offset(path, v=None)[source]

Try to determine the name of a package from its filename or URL.

Parameters:
  • path (str) – The filename or URL for the package

  • v (str) – The version of the package

Returns:

A tuple containing:

name of the package, first index of name, length of name, the index of the matching regex, the matching regex

Return type:

tuple

Raises:

UndetectableNameError – If the URL does not match any regexes

spack.url.parse_version(path: str) StandardVersion[source]

Try to extract a version string from a filename or URL.

Parameters:

path – The filename or URL for the package

Returns:

The version of the package

Raises:

UndetectableVersionError – If the URL does not match any regexes

spack.url.parse_version_offset(path)[source]

Try to extract a version string from a filename or URL.

Parameters:

path (str) – The filename or URL for the package

Returns:

A tuple containing:

version of the package, first index of version, length of version string, the index of the matching regex, the matching regex

Return type:

tuple

Raises:

UndetectableVersionError – If the URL does not match any regexes

spack.url.strip_name_suffixes(path, version)[source]

Most tarballs contain a package name followed by a version number. However, some also contain extraneous information in-between the name and version:

  • rgb-1.0.6

  • converge_install_2.3.16

  • jpegsrc.v9b

These strings are not part of the package name and should be ignored. This function strips the version number and any extraneous suffixes off and returns the remaining string. The goal is that the name is always the last thing in path:

  • rgb

  • converge

  • jpeg

Parameters:
  • path (str) – The filename or URL for the package

  • version (str) – The version detected for this URL

Returns:

The path with any extraneous suffixes removed

Return type:

str

spack.url.substitute_version(path, new_version)[source]

Given a URL or archive name, find the version in the path and substitute the new version for it. Replace all occurrences of the version if they don’t overlap with the package name.

Simple example:

>>> substitute_version('http://www.mr511.de/software/libelf-0.8.13.tar.gz', '2.9.3')
'http://www.mr511.de/software/libelf-2.9.3.tar.gz'

Complex example:

>>> substitute_version('https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz', '2.3')
'https://www.hdfgroup.org/ftp/HDF/releases/HDF2.3/src/hdf-2.3.tar.gz'

spack.url.substitution_offsets(path)[source]

This returns offsets for substituting versions and names in the provided path. It is a helper for substitute_version().

spack.url.wildcard_version(path)[source]

Find the version in the supplied path, and return a regular expression that will match this path with any version in its place.

spack.user_environment module

spack.user_environment.environment_modifications_for_specs(*specs: Spec, view=None, set_package_py_globals: bool = True)[source]

List of environment (shell) modifications to be processed for spec.

This list is specific to the location of the spec or its projection in the view.

Parameters:
  • specs – spec(s) for which to list the environment modifications

  • view – view associated with the spec passed as first argument

  • set_package_py_globals – whether or not to set the global variables in the package.py files (this may be problematic when using buildcaches that have been built on a different but compatible OS)

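A minimal sketch, assuming an installed (or at least concretizable) zlib spec and that the returned object is an EnvironmentModifications instance from spack.util.environment, whose shell_modifications() method renders the changes as shell code.

import spack.spec
import spack.user_environment as uenv

spec = spack.spec.Spec("zlib").concretized()
mods = uenv.environment_modifications_for_specs(spec)

# Render the modifications (PATH, MANPATH, ...) as bash commands.
print(mods.shell_modifications(shell="bash"))
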
spack.user_environment.prefix_inspections(platform)[source]

Get list of prefix inspections for platform

Parameters:

platform (str) – the name of the platform to consider. The platform determines what environment variables Spack will use for some inspections.

Returns:

A dictionary mapping subdirectory names to lists of environment variables to modify with that directory if it exists.

spack.user_environment.project_env_mods(*specs: Spec, view, env: EnvironmentModifications) None[source]

Given a list of environment modifications, project path changes onto the view.

spack.user_environment.spack_loaded_hashes_var = 'SPACK_LOADED_HASHES'

Environment variable name Spack uses to track individually loaded packages

spack.user_environment.unconditional_environment_modifications(view)[source]

List of environment (shell) modifications to be processed for view.

This list does not depend on the specs in this environment

spack.variant module

The variant module contains data structures that are needed to manage variants both in packages and in specs.

class spack.variant.AbstractVariant(name, value, propagate=False)[source]

Bases: object

A variant that has not yet decided who it wants to be. It behaves like a multi valued variant which could do things.

This kind of variant is generated during parsing of expressions like foo=bar and differs from multi valued variants because it will satisfy any other variant with the same name. This is because it could do it if it grows up to be a multi valued variant with the right set of values.

compatible(other)[source]

Returns True if self and other are compatible, False otherwise.

As there is no semantic check, two VariantSpec are compatible if either they contain the same value or they are both multi-valued.

Parameters:

other – instance against which we test compatibility

Returns:

True or False

Return type:

bool

constrain(other)[source]

Modify self to match all the constraints for other if both instances are multi-valued. Returns True if self changed, False otherwise.

Parameters:

other – instance against which we constrain self

Returns:

True or False

Return type:

bool

copy()[source]

Returns an instance of a variant equivalent to self

Returns:

a copy of self

Return type:

AbstractVariant

>>> a = MultiValuedVariant('foo', True)
>>> b = a.copy()
>>> assert a == b
>>> assert a is not b

static from_node_dict(name, value)[source]

Reconstruct a variant from a node dict.

intersects(other)[source]

Returns True if there are variants matching both self and other, False otherwise.

satisfies(other)[source]

Returns true if other.name == self.name, because any value that other holds and is not in self yet could be added.

Parameters:

other – constraint to be met for the method to return True

Returns:

True or False

Return type:

bool

property value

Returns a tuple of strings containing the values stored in the variant.

Returns:

values stored in the variant

Return type:

tuple

yaml_entry()[source]

Returns a key, value tuple suitable to be an entry in a yaml dict.

Returns:

(name, value_representation)

Return type:

tuple

class spack.variant.BoolValuedVariant(name, value, propagate=False)[source]

Bases: SingleValuedVariant

A variant that can hold either True or False.

BoolValuedVariant can also hold the value ‘*’, for coerced comparisons between foo=* and +foo or ~foo.

class spack.variant.DisjointSetsOfValues(*sets)[source]

Bases: Sequence

Allows combinations from one of many mutually exclusive sets.

The value ('none',) is reserved to denote the empty set and therefore no other set can contain the item 'none'.

Parameters:

*sets (list) – mutually exclusive sets of values

allow_empty_set()[source]

Adds the empty set to the current list of disjoint sets.

feature_values

Attribute used to track values that correspond to features that can be enabled or disabled, as understood by the package’s build system.

prohibit_empty_set()[source]

Removes the empty set from the current list of disjoint sets.

property validator

with_default(default)[source]

Sets the default value and returns self.

with_error(error_fmt)[source]

Sets the error message format and returns self.

with_non_feature_values(*values)[source]

Marks a few values as not being tied to a feature.

exception spack.variant.DuplicateVariantError(message, long_message=None)[source]

Bases: SpecError

Raised when the same variant occurs in a spec twice.

exception spack.variant.InconsistentValidationError(vspec, variant)[source]

Bases: SpecError

Raised if the wrong validator is used to validate a variant.

exception spack.variant.InvalidVariantForSpecError(variant, when, spec)[source]

Bases: SpecError

Raised when an invalid conditional variant is specified.

exception spack.variant.InvalidVariantValueCombinationError(message, long_message=None)[source]

Bases: SpecError

Raised when a variant has values ‘*’ or ‘none’ with other values.

exception spack.variant.InvalidVariantValueError(variant, invalid_values, pkg)[source]

Bases: SpecError

Raised when a valid variant has at least one invalid value.

class spack.variant.MultiValuedVariant(name, value, propagate=False)[source]

Bases: AbstractVariant

A variant that can hold multiple values at once.

append(value)[source]

Add another value to this multi-valued variant.

satisfies(other)[source]

Returns true if other.name == self.name and other.value is a strict subset of self. Does not try to validate.

Parameters:

other – constraint to be met for the method to return True

Returns:

True or False

Return type:

bool

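To illustrate the satisfies/constrain semantics documented above, here is a small sketch; the variant name and values are arbitrary, and the expected results follow the docstrings rather than a verified run.

from spack.variant import MultiValuedVariant

a = MultiValuedVariant("cuda_arch", ("70", "80"))
b = MultiValuedVariant("cuda_arch", ("70",))

print(a.satisfies(b))  # expected True: b's values are contained in a's
print(b.satisfies(a))  # expected False: "80" is not among b's values

b.constrain(a)         # b should now hold the union of both value sets
print(b.value)
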
exception spack.variant.MultipleValuesInExclusiveVariantError(variant, pkg)[source]

Bases: SpecError, ValueError

Raised when multiple values are present in a variant that wants only one.

class spack.variant.SingleValuedVariant(name, value, propagate=False)[source]

Bases: AbstractVariant

A variant that can hold multiple values, but one at a time.

compatible(other)[source]

Returns True if self and other are compatible, False otherwise.

As there is no semantic check, two VariantSpec are compatible if either they contain the same value or they are both multi-valued.

Parameters:

other – instance against which we test compatibility

Returns:

True or False

Return type:

bool

constrain(other)[source]

Modify self to match all the constraints for other if both instances are multi-valued. Returns True if self changed, False otherwise.

Parameters:

other – instance against which we constrain self

Returns:

True or False

Return type:

bool

intersects(other)[source]

Returns True if there are variants matching both self and other, False otherwise.

satisfies(other)[source]

Returns true if other.name == self.name, because any value that other holds and is not in self yet could be added.

Parameters:

other – constraint to be met for the method to return True

Returns:

True or False

Return type:

bool

yaml_entry()[source]

Returns a key, value tuple suitable to be an entry in a yaml dict.

Returns:

(name, value_representation)

Return type:

tuple

exception spack.variant.UnknownVariantError(spec, variants)[source]

Bases: SpecError

Raised when an unknown variant occurs in a spec.

exception spack.variant.UnsatisfiableVariantSpecError(provided, required)[source]

Bases: UnsatisfiableSpecError

Raised when a spec variant conflicts with package constraints.

class spack.variant.Value(value, when)[source]

Bases: object

Conditional value that might be used in variants.

class spack.variant.Variant(name, default, description, values=(True, False), multi=False, validator=None, sticky=False)[source]

Bases: object

Represents a variant in a package, as declared in the variant directive.

property allowed_values

Returns a string representation of the allowed values for printing purposes

Returns:

representation of the allowed values

Return type:

str

make_default()[source]

Factory that creates a variant holding the default value.

Returns:

instance of the proper variant

Return type:

MultiValuedVariant or SingleValuedVariant or BoolValuedVariant

make_variant(value)[source]

Factory that creates a variant holding the value passed as a parameter.

Parameters:

value – value that will be held by the variant

Returns:

instance of the proper variant

Return type:

MultiValuedVariant or SingleValuedVariant or BoolValuedVariant

validate_or_raise(vspec, pkg_cls=None)[source]

Validate a variant spec against this package variant. Raises an exception if any error is found.

Parameters:
  • vspec – variant spec to be validated

  • pkg_cls – the package class that defines this variant, if available

property variant_cls

Proper variant class to be used for this configuration.

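A small sketch of declaring a Variant object directly and exercising its factories; in real packages this is normally done through the variant() directive, so constructing Variant by hand here is purely illustrative.

from spack.variant import Variant

shared = Variant(
    "shared",                      # name
    default=True,
    description="Build shared libraries",
    values=(True, False),
    multi=False,
)

v = shared.make_default()          # variant holding the default value
print(type(v).__name__, v.value)

w = shared.make_variant(False)     # variant holding an explicit value
shared.validate_or_raise(w)        # raises if the value is not allowed
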
class spack.variant.VariantMap(spec)[source]

Bases: HashableMap

Map containing variant instances. New values can be added only if the key is not already present.

property concrete

Returns True if the spec is concrete in terms of variants.

Returns:

True or False

Return type:

bool

constrain(other)[source]

Add all variants in other that aren’t in self to self. Also constrain all multi-valued variants that are already present. Return True if self changed, False otherwise

Parameters:

other (VariantMap) – instance against which we constrain self

Returns:

True or False

Return type:

bool

copy()[source]

Return an instance of VariantMap equivalent to self.

Returns:

a copy of self

Return type:

VariantMap

dict

intersects(other)[source]

satisfies(other)[source]
substitute(vspec)[source]

Substitutes the entry under vspec.name with vspec.

Parameters:

vspec – variant spec to be substituted

spack.variant.any_combination_of(*values)[source]

Multi-valued variant that allows any combination of the specified values, and also allows the user to specify ‘none’ (as a string) to choose none of them.

It is up to the package implementation to handle the value ‘none’ specially, if at all.

Parameters:

*values – allowed variant values

Returns:

a properly initialized instance of DisjointSetsOfValues

spack.variant.auto_or_any_combination_of(*values)[source]

Multi-valued variant that allows any combination of a set of values (but not the empty set) or ‘auto’.

Parameters:

*values – allowed variant values

Returns:

a properly initialized instance of DisjointSetsOfValues

spack.variant.conditional(*values, **kwargs)[source]

Conditional values that can be used in variant declarations.

spack.variant.disjoint_sets(*sets)[source]

Multi-valued variant that allows any combination picking from one of multiple disjoint sets of values, and also allows the user to specify ‘none’ (as a string) to choose none of them.

It is up to the package implementation to handle the value ‘none’ specially, if at all.

Parameters:

*sets – mutually exclusive sets of values

Returns:

a properly initialized instance of DisjointSetsOfValues

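A hedged sketch of how these helpers typically appear in a package's variant() directives; this is a fragment of a hypothetical package.py, and the package metadata, variant names, and value sets are all illustrative.

from spack.package import *  # the usual package.py preamble


class Example(Package):
    """Hypothetical package illustrating the two helpers."""

    homepage = "https://example.com"                # placeholder metadata
    url = "https://example.com/example-1.0.tar.gz"  # placeholder metadata

    # Any combination of the listed values, or the string 'none'.
    variant(
        "languages",
        values=any_combination_of("c", "cxx", "fortran"),
        description="Language front ends to enable",
    )

    # Values may be picked from only one of the two sets; default to 'auto'.
    variant(
        "threading",
        values=disjoint_sets(("auto",), ("openmp", "pthreads")).with_default("auto"),
        description="Threading backend",
    )
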
spack.variant.implicit_variant_conversion(method)[source]

Converts other to type(self) and calls method(self, other)

Parameters:

method – any predicate method that takes another variant as an argument

Returns: decorated method

spack.variant.substitute_abstract_variants(spec)[source]

Uses the information in spec.package to turn any variant that needs it into a SingleValuedVariant.

This method is best effort. All variants that can be substituted will be substituted before any error is raised.

Parameters:

spec – spec on which to operate the substitution

spack.verify module

class spack.verify.VerificationResults[source]

Bases: object

add_error(path, field)[source]

has_errors()[source]

json_string()[source]

spack.verify.check_entry(path, data)[source]

spack.verify.check_file_manifest(filename)[source]

spack.verify.check_spec_manifest(spec)[source]

spack.verify.compute_hash(path: str, block_size: int = 1048576) str[source]

spack.verify.create_manifest_entry(path: str) Dict[str, Any][source]

spack.verify.write_manifest(spec)[source]
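
Since the entries above carry no descriptions, here is a hedged sketch of how they appear to fit together; it assumes check_spec_manifest() returns a VerificationResults object, that zlib is installed in the local store, and that spack.store.STORE is the global store object (as in recent Spack versions).

import spack.store
import spack.verify

# Look up an installed spec in the local database.
spec = spack.store.STORE.db.query("zlib")[0]

results = spack.verify.check_spec_manifest(spec)
if results.has_errors():
    print(results.json_string())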