spack package

Subpackages

Submodules

spack.abi module

class spack.abi.ABI[source]

Bases: object

This class provides methods to test ABI compatibility between specs. The current implementation is rather rough and could be improved.

architecture_compatible(target, constraint)[source]

Return true if the architecture of the target spec is ABI compatible with the architecture of the constraint spec. If either the target or the constraint spec has no architecture, the target is also considered architecture ABI compatible with the constraint.

compatible(target, constraint, **kwargs)[source]

Return true if the target spec is ABI compatible with the constraint spec.

compiler_compatible(parent, child, **kwargs)[source]

Return true if compilers for parent and child are ABI compatible.

spack.audit module

Classes and functions to register audit checks for various parts of Spack and run them on-demand.

To register a new class of sanity checks (e.g. sanity checks for compilers.yaml), the first action required is to create a new AuditClass object:

audit_cfgcmp = AuditClass(
    tag='CFG-COMPILER',
    description='Sanity checks on compilers.yaml',
    kwargs=()
)

This object is used as a decorator to register functions that each perform a single check:

@audit_cfgcmp
def _search_duplicate_compilers(error_cls):
    pass

These functions must accept as arguments the keywords declared when creating the decorator object, plus a final error_cls argument that acts as a factory to create Error objects. Each function should return a (possibly empty) list of errors.

Calls to each of these functions are triggered by the run method of the decorator object, which forwards the keyword arguments passed as input.
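For illustration, a registered check might look like the following sketch. The check body and the reported details are hypothetical; only the error_cls factory and the run method come from this module:

@audit_cfgcmp
def _check_duplicate_compiler_entries(error_cls):
    errors = []
    # Hypothetical check body: report one problem via the error factory
    errors.append(error_cls(
        summary='duplicate compiler entry in compilers.yaml',
        details=['gcc@9.3.0 is declared twice']
    ))
    return errors

# Trigger every check registered on the audit class; keyword arguments
# passed to run() are forwarded to the checks.
errors = audit_cfgcmp.run()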

class spack.audit.AuditClass(group, tag, description, kwargs)[source]

Bases: collections.abc.Sequence

run(**kwargs)[source]
spack.audit.CALLBACKS = {'CFG-COMPILER': <spack.audit.AuditClass object>, 'CFG-PACKAGES': <spack.audit.AuditClass object>, 'GENERIC': <spack.audit.AuditClass object>, 'PKG-DIRECTIVES': <spack.audit.AuditClass object>, 'PKG-HTTPS-DIRECTIVES': <spack.audit.AuditClass object>}

Map an audit tag to a list of callables implementing checks

class spack.audit.Error(summary, details)[source]

Bases: object

Information on an error reported in a test.

spack.audit.GROUPS = {'configs': ['CFG-COMPILER', 'CFG-PACKAGES'], 'generic': ['GENERIC'], 'packages': ['PKG-DIRECTIVES'], 'packages-https': ['PKG-HTTPS-DIRECTIVES']}

Map a group of checks to the list of related audit tags

spack.audit.config_compiler = <spack.audit.AuditClass object>

Sanity checks on compilers.yaml

spack.audit.config_packages = <spack.audit.AuditClass object>

Sanity checks on packages.yaml

spack.audit.generic = <spack.audit.AuditClass object>

Generic checks relying on global state

spack.audit.package_directives = <spack.audit.AuditClass object>

Sanity checks on package directives

spack.audit.run_check(tag, **kwargs)[source]

Run the checks associated with a single tag.

Parameters
  • tag (str) – tag of the check

  • **kwargs – keyword arguments forwarded to the checks

Returns

Errors occurred during the checks

spack.audit.run_group(group, **kwargs)[source]

Run the checks that are part of the group passed as argument.

Parameters
  • group (str) – group of checks to be run

  • **kwargs – keyword arguments forwarded to the checks

Returns

List of (tag, errors) that failed.
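A minimal usage sketch (the tag and group names come from the CALLBACKS and GROUPS maps above; extra keyword arguments required by some checks are omitted):

import spack.audit

# Run the checks for a single tag...
errors = spack.audit.run_check('CFG-COMPILER')

# ...or run a whole group and inspect which tags reported errors
for tag, errors in spack.audit.run_group('configs'):
    for error in errors:
        print(tag, error)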

spack.binary_distribution module

class spack.binary_distribution.BinaryCacheIndex(cache_root)[source]

Bases: object

The BinaryCacheIndex tracks what specs are available on (usually remote) binary caches.

This index is “best effort”, in the sense that whenever we don’t find what we’re looking for here, we will attempt to fetch it directly from configured mirrors anyway. Thus, it has the potential to speed things up, but cache misses shouldn’t break any spack functionality.

At the moment, everything in this class is initialized as lazily as possible, so that it avoids slowing anything in spack down until absolutely necessary.

TODO: What’s the cost if, e.g., we realize in the middle of a spack install that the cache is out of date, and we fetch directly? Does it mean we should have paid the price to update the cache earlier?

clear()[source]

For testing purposes we need to be able to empty the cache and clear associated data structures.

find_built_spec(spec)[source]

Look in our cache for the built spec corresponding to spec.

If the spec can be found among the configured binary mirrors, a list is returned that contains the concrete spec and the mirror url of each mirror where it can be found. Otherwise, None is returned.

This method does not trigger reading anything from remote mirrors, but rather just checks if the concrete spec is found within the cache.

The cache can be updated by calling update() on the cache.

Parameters

spec (spack.spec.Spec) – Concrete spec to find

Returns

A list of objects containing the found specs and the mirror url where each can be found, e.g.:

[
    {
        "spec": <concrete-spec>,
        "mirror_url": <mirror-root-url>
    }
]
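A usage sketch of the singleton index (the zlib spec is only an example; update() is called first so the local index cache is populated):

import spack.spec
from spack.binary_distribution import binary_index

binary_index.update()                        # refresh the local index cache
spec = spack.spec.Spec('zlib').concretized()
hits = binary_index.find_built_spec(spec)
if hits:
    for entry in hits:
        print(entry['spec'], entry['mirror_url'])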

find_by_hash(find_hash)[source]

Same as find_built_spec but uses the hash of a spec.

Parameters

find_hash (str) – hash of the spec to search

get_all_built_specs()[source]
regenerate_spec_cache(clear_existing=False)[source]

Populate the local cache of concrete specs (_mirrors_for_spec) from the locally cached buildcache index files. This is essentially a no-op if it has already been done, as we keep track of the index hashes for which we have already associated the built specs.

update()[source]

Make sure local cache of buildcache index files is up to date. If the same mirrors are configured as the last time this was called and none of the remote buildcache indices have changed, calling this method will only result in fetching the index hash from each mirror to confirm it is the same as what is stored locally. Otherwise, the buildcache index.json and index.json.hash files are retrieved from each configured mirror and stored locally (both in memory and on disk under _index_cache_root).

update_spec(spec, found_list)[source]

Take a list of {‘mirror_url’: m, ‘spec’: s} objects and update the local built_spec_cache

exception spack.binary_distribution.FetchCacheError(errors)[source]

Bases: Exception

Error thrown when fetching the cache failed, usually a composite error list.

exception spack.binary_distribution.NewLayoutException(msg)[source]

Bases: spack.error.SpackError

Raised if directory layout is different from buildcache.

exception spack.binary_distribution.NoChecksumException(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised if file fails checksum verification.

exception spack.binary_distribution.NoGpgException(msg)[source]

Bases: spack.error.SpackError

Raised when gpg2 is not in PATH

exception spack.binary_distribution.NoKeyException(msg)[source]

Bases: spack.error.SpackError

Raised when gpg has no default key added.

exception spack.binary_distribution.NoOverwriteException(file_path)[source]

Bases: spack.error.SpackError

Raised when a file exists and must be overwritten.

exception spack.binary_distribution.NoVerifyException(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised if file fails signature verification.

exception spack.binary_distribution.PickKeyException(keys)[source]

Bases: spack.error.SpackError

Raised when multiple keys can be used to sign.

spack.binary_distribution.binary_index = <spack.binary_distribution.BinaryCacheIndex object>

Singleton binary_index instance

spack.binary_distribution.binary_index_location()[source]

Set up a BinaryCacheIndex for remote buildcache dbs in the user’s homedir.

spack.binary_distribution.build_cache_keys_relative_path()[source]
spack.binary_distribution.build_cache_prefix(prefix)[source]
spack.binary_distribution.build_cache_relative_path()[source]
spack.binary_distribution.build_tarball(spec, outdir, force=False, rel=False, unsigned=False, allow_root=False, key=None, regenerate_index=False)[source]

Build a tarball from given spec and put it into the directory structure used at the mirror (following <tarball_directory_name>).

spack.binary_distribution.buildinfo_file_name(prefix)[source]

Filename of the binary package meta-data file

spack.binary_distribution.check_package_relocatable(workdir, spec, allow_root)[source]

Check if package binaries are relocatable. Change links to placeholder links.

spack.binary_distribution.check_specs_against_mirrors(mirrors, specs, output_file=None, rebuild_on_errors=False)[source]

Check all the given specs against buildcaches on the given mirrors and determine if any of the specs need to be rebuilt. Reasons for needing to rebuild include: the binary cache for the spec isn’t present on a mirror, or it is present but the full_hash has changed since the last time the spec was built.

Parameters
  • mirrors (dict) – Mirrors to check against

  • specs (typing.Iterable) – Specs to check against mirrors

  • output_file (str) – Path to output file to be written. If provided, mirrors with missing or out-of-date specs will be formatted as a JSON object and written to this file.

  • rebuild_on_errors (bool) – Treat any errors encountered while checking specs as a signal to rebuild package.

Returns: 1 if any spec was out-of-date on any mirror, 0 otherwise.

spack.binary_distribution.checksum_tarball(file)[source]
spack.binary_distribution.clear_spec_cache()[source]
spack.binary_distribution.compute_hash(data)[source]
spack.binary_distribution.download_buildcache_entry(file_descriptions, mirror_url=None)[source]
spack.binary_distribution.download_tarball(spec, preferred_mirrors=None)[source]

Download binary tarball for given package into stage area, returning path to downloaded tarball if successful, None otherwise.

Parameters
  • spec (spack.spec.Spec) – Concrete spec

  • preferred_mirrors (list) – If provided, this is a list of preferred mirror urls. Other configured mirrors will only be used if the tarball can’t be retrieved from one of these.

Returns

Path to the downloaded tarball, or None if the tarball could not be downloaded from any configured mirrors.
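A minimal sketch, assuming spec is an already-concretized spack.spec.Spec and the preferred mirror URL is hypothetical:

from spack.binary_distribution import download_tarball

path = download_tarball(spec, preferred_mirrors=['https://cache.example.com'])
if path is None:
    print('tarball not available on any configured mirror')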

spack.binary_distribution.extract_tarball(spec, filename, allow_root=False, unsigned=False, force=False)[source]

Extract the binary tarball for the given package into the install area.

spack.binary_distribution.generate_key_index(key_prefix, tmpdir=None)[source]

Create the key index page.

Creates (or replaces) the “index.json” page at the location given in key_prefix. This page contains an entry for each key (.pub) under key_prefix.

spack.binary_distribution.generate_package_index(cache_prefix)[source]

Create the build cache index page.

Creates (or replaces) the “index.json” page at the location given in cache_prefix. This page contains a link for each binary package (.yaml or .json) under cache_prefix.

spack.binary_distribution.get_buildfile_manifest(spec)[source]

Return a data structure with information about a build, including text_to_relocate, binary_to_relocate, binary_to_relocate_fullpath, link_to_relocate, and other (files that do not fit any of the previous categories and should not be relocated). We blacklist docs (man) and metadata (.spack). This can be used to find a particular kind of file in spack, or to generate the build metadata.

spack.binary_distribution.get_keys(install=False, trust=False, force=False, mirrors=None)[source]

Get pgp public keys available on mirror with suffix .pub

spack.binary_distribution.get_mirrors_for_spec(spec=None, full_hash_match=False, mirrors_to_check=None, index_only=False)[source]

Check if concrete spec exists on mirrors and return a list indicating the mirrors on which it can be found

Parameters
  • spec (spack.spec.Spec) – The spec to look for in binary mirrors

  • full_hash_match (bool) – If True, only includes mirrors where the spec full hash matches the locally computed full hash of the spec argument. If False, any mirror which has a matching DAG hash is included in the results.

  • mirrors_to_check (dict) – Optionally override the configured mirrors with the mirrors in this dictionary.

  • index_only (bool) – Do not attempt direct fetching of spec.json files from remote mirrors, only consider the indices.

Returns

A list of objects, each containing mirror_url and spec keys, indicating all mirrors where the spec can be found.

spack.binary_distribution.make_package_relative(workdir, spec, allow_root)[source]

Change paths in binaries to relative paths. Change absolute symlinks to relative symlinks.

spack.binary_distribution.needs_rebuild(spec, mirror_url, rebuild_on_errors=False)[source]
spack.binary_distribution.push_keys(*mirrors, **kwargs)[source]

Upload pgp public keys to the given mirrors

spack.binary_distribution.read_buildinfo_file(prefix)[source]

Read buildinfo file

spack.binary_distribution.relocate_package(spec, allow_root)[source]

Relocate the given package

spack.binary_distribution.select_signing_key(key=None)[source]
spack.binary_distribution.sign_tarball(key, force, specfile_path)[source]
spack.binary_distribution.tarball_directory_name(spec)[source]

Return name of the tarball directory according to the convention <os>-<architecture>/<compiler>/<package>-<version>/

spack.binary_distribution.tarball_name(spec, ext)[source]

Return the name of the tarfile according to the convention <os>-<architecture>-<package>-<dag_hash><ext>

spack.binary_distribution.tarball_path_name(spec, ext)[source]

Return the full path+name for a given spec according to the convention <tarball_directory_name>/<tarball_name>

spack.binary_distribution.try_direct_fetch(spec, full_hash_match=False, mirrors=None)[source]

Try to find the spec directly on the configured mirrors

spack.binary_distribution.update_cache_and_get_specs()[source]

Get all concrete specs for build caches available on configured mirrors. Initialization of internal cache data structures is done as lazily as possible, so this method will also attempt to initialize and update the local index cache (essentially a no-op if it has been done already and nothing has changed on the configured mirrors.)

Throws:

FetchCacheError

spack.binary_distribution.write_buildinfo_file(spec, workdir, rel=False)[source]

Create a cache file containing information required for the relocation

spack.bootstrap module

spack.bootstrap.black_root_spec()[source]
spack.bootstrap.clingo_root_spec()[source]

Return the root spec used to bootstrap clingo

spack.bootstrap.ensure_black_in_path_or_raise()[source]

Ensure that black is in the PATH or raise.

spack.bootstrap.ensure_bootstrap_configuration()[source]
spack.bootstrap.ensure_clingo_importable_or_raise()[source]

Ensure that the clingo module is available for import.

spack.bootstrap.ensure_executables_in_path_or_raise(executables, abstract_spec)[source]

Ensure that some executables are in path or raise.

Parameters
  • executables (list) – list of executables to be searched in the PATH, in order. The function exits on the first one found.

  • abstract_spec (str) – abstract spec that provides the executables

Raises

RuntimeError – if the executables cannot be ensured to be in PATH

Returns

Executable object

spack.bootstrap.ensure_flake8_in_path_or_raise()[source]

Ensure that flake8 is in the PATH or raise.

spack.bootstrap.ensure_gpg_in_path_or_raise()[source]

Ensure gpg or gpg2 are in the PATH or raise.

spack.bootstrap.ensure_isort_in_path_or_raise()[source]

Ensure that isort is in the PATH or raise.

spack.bootstrap.ensure_module_importable_or_raise(module, abstract_spec=None)[source]

Make the requested module available for import, or raise.

This function tries to import a Python module in the current interpreter using, in order, the methods configured in bootstrap.yaml.

If none of the methods succeed, an exception is raised. The function exits on first success.

Parameters
  • module (str) – module to be imported in the current interpreter

  • abstract_spec (str) – abstract spec that might provide the module. If not given, it defaults to the name of the module to be imported.

Raises

ImportError – if the module couldn’t be imported
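A minimal sketch of bootstrapping clingo with this function (clingo_root_spec() is defined below in this module):

import spack.bootstrap

spack.bootstrap.ensure_module_importable_or_raise(
    module='clingo', abstract_spec=str(spack.bootstrap.clingo_root_spec()))
import clingo  # now importable in the current interpreter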

spack.bootstrap.ensure_mypy_in_path_or_raise()[source]

Ensure that mypy is in the PATH or raise.

spack.bootstrap.ensure_patchelf_in_path_or_raise()[source]

Ensure patchelf is in the PATH or raise.

spack.bootstrap.flake8_root_spec()[source]
spack.bootstrap.gnupg_root_spec()[source]

Return the root spec used to bootstrap GnuPG

spack.bootstrap.isort_root_spec()[source]
spack.bootstrap.mypy_root_spec()[source]
spack.bootstrap.patchelf_root_spec()[source]

Return the root spec used to bootstrap patchelf

spack.bootstrap.spack_python_interpreter()[source]

Override the current configuration to set the interpreter under which Spack is currently running as the only Python external spec available.

spack.bootstrap.spec_for_current_python()[source]

For bootstrapping purposes we are just interested in the Python minor version (all patch releases are ABI compatible within the same minor version) and in whether ucs4 support has been enabled for Python 2.7

See:

https://www.python.org/dev/peps/pep-0513/ https://stackoverflow.com/a/35801395/771663

spack.bootstrap.store_path()[source]

Path to the store used for bootstrapped software

spack.build_environment module

This module contains all routines related to setting up the package build environment. All of this is set up by package.py just before install() is called.

There are two parts to the build environment:

  1. Python build environment (i.e. install() method)

    This is how things are set up when install() is called. Spack takes advantage of each package being in its own module by adding a bunch of command-like functions (like configure(), make(), etc.) in the package’s module scope. This allows package writers to call them all directly in Package.install() without writing ‘self.’ everywhere. No, this isn’t Pythonic. Yes, it makes the code more readable and more like the shell script from which someone is likely porting.

  2. Build execution environment

    This is the set of environment variables, like PATH, CC, CXX, etc. that control the build. There are also a number of environment variables used to pass information (like RPATHs and other information about dependencies) to Spack’s compiler wrappers. All of these env vars are also set up here.

Skimming this module is a nice way to get acquainted with the types of calls you can make from within the install() function.

exception spack.build_environment.ChildError(msg, module, classname, traceback_string, log_name, log_type, context)[source]

Bases: spack.build_environment.InstallError

Special exception class for wrapping exceptions from child processes in Spack’s build environment.

The main features of a ChildError are:

  1. They’re serializable, so when a child build fails, we can send one of these to the parent and let the parent report what happened.

  2. They have a traceback field containing a traceback generated on the child immediately after failure. Spack will print this on failure in lieu of trying to run sys.excepthook on the parent process, so users will see the correct stack trace from a child.

  3. They also contain context, which shows context in the Package implementation where the error happened. This helps people debug Python code in their packages. To get it, Spack searches the stack trace for the deepest frame where self is in scope and is an instance of PackageBase. This will generally find a useful spot in the package.py file.

The long_message of a ChildError displays one of two things:

  1. If the original error was a ProcessError, indicating a command died during the build, we’ll show context from the build log.

  2. If the original error was any other type of error, we’ll show context from the Python code.

SpackError handles displaying the special traceback if we’re in debug mode with spack -d.

build_errors = [('spack.util.executable', 'ProcessError')]
property long_message
exception spack.build_environment.InstallError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised by packages when a package fails to install.

Any subclass of InstallError will be annotated by Spack with a pkg attribute on failure, which the caller can use to get the package for which the exception was raised.

class spack.build_environment.MakeExecutable(name, jobs)[source]

Bases: spack.util.executable.Executable

Special callable executable object for make so the user can specify parallelism options on a per-invocation basis. Specifying ‘parallel’ to the call will override whatever the package’s global setting is, so you can either default to true or false and override particular calls. Specifying ‘jobs_env’ to a particular call will name an environment variable which will be set to the parallelism level (without affecting the normal invocation with -j).

Note that if the SPACK_NO_PARALLEL_MAKE env var is set it overrides everything.
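A sketch of how a package might use these options from within install(); the make callable is injected into the package module scope (see set_module_variables_for_package below), and the targets and the MAKE_JOBS variable name are illustrative:

def install(self, spec, prefix):
    # 'make' is a MakeExecutable available at package module scope
    make()                                  # use the package/global parallel setting
    make('check', parallel=False)           # force a serial invocation for this call
    make('install', jobs_env='MAKE_JOBS')   # expose the job count via $MAKE_JOBS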

exception spack.build_environment.StopPhase(message, long_message=None)[source]

Bases: spack.error.SpackError

Pickle-able exception to control stopped builds.

spack.build_environment.clean_environment()[source]
spack.build_environment.determine_number_of_jobs(parallel=False, command_line=None, config_default=None, max_cpus=None)[source]

Packages that require sequential builds need 1 job. Otherwise we use the number of jobs set on the command line. If not set, then we use the config defaults (which is usually set through the builtin config scope), but we cap to the number of CPUs available to avoid oversubscription.

Parameters
  • parallel (bool or None) – true when package supports parallel builds

  • command_line (int or None) – command line override

  • config_default (int or None) – config default number of jobs

  • max_cpus (int or None) – maximum number of CPUs available. When None, this value is automatically determined.

spack.build_environment.get_cmake_prefix_path(pkg)[source]
spack.build_environment.get_package_context(traceback, context=3)[source]

Return some context for an error message when the build fails.

Parameters
  • traceback – A traceback from some exception raised during install

  • context (int) – Lines of context to show before and after the line where the error happened

This function inspects the stack to find where we failed in the package file, and it adds detailed context to the long_message from there.

spack.build_environment.get_rpath_deps(pkg)[source]

Return immediate or transitive RPATHs depending on the package.

spack.build_environment.get_rpaths(pkg)[source]

Get a list of all the rpaths for a package.

spack.build_environment.get_std_cmake_args(pkg)[source]

List of standard arguments used if a package is a CMakePackage.

Parameters

pkg (spack.package.PackageBase) – package under consideration

Returns

standard arguments (i.e. the arguments for cmake) that would be used if this package were a CMakePackage instance.

Return type

list

spack.build_environment.get_std_meson_args(pkg)[source]

List of standard arguments used if a package is a MesonPackage.

Parameters

pkg (spack.package.PackageBase) – package under consideration

Returns

standard arguments (i.e. the arguments for meson) that would be used if this package were a MesonPackage instance.

Return type

list
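For example, to inspect the standard arguments Spack would pass for a package (a sketch assuming spec is an already-concretized spec):

import spack.build_environment as build_env

pkg = spec.package
cmake_args = build_env.get_std_cmake_args(pkg)
meson_args = build_env.get_std_meson_args(pkg)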

spack.build_environment.load_external_modules(pkg)[source]

Traverse a package’s spec DAG and load any external modules.

Traverse a package’s dependencies and load any external modules associated with them.

Parameters

pkg (spack.package.PackageBase) – package to load deps for

spack.build_environment.modifications_from_dependencies(spec, context, custom_mods_only=True)[source]

Returns the environment modifications that are required by the dependencies of a spec and also applies modifications to this spec’s package at module scope, if need be.

Environment modifications include:

  • Updating PATH so that executables can be found

  • Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective tools can find Spack-built dependencies

  • Running custom package environment modifications

Custom package modifications can conflict with the default PATH changes we make (specifically for the PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH environment variables), so this applies changes in a fixed order:

  • All modifications (custom and default) from external deps first

  • All modifications from non-external deps afterwards

With that order, PrependPath actions from non-external default environment modifications will take precedence over custom modifications from external packages.

A secondary constraint is that custom and default modifications are grouped on a per-package basis: combined with the post-order traversal this means that default modifications of dependents can override custom modifications of dependencies (again, this would only occur for PATH, CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).

Parameters
  • spec (spack.spec.Spec) – spec for which we want the modifications

  • context (str) – either ‘build’ for build-time modifications or ‘run’ for run-time modifications

spack.build_environment.parent_class_modules(cls)[source]

Get list of superclass modules that descend from spack.package.PackageBase

Includes cls.__module__

spack.build_environment.set_compiler_environment_variables(pkg, env)[source]
spack.build_environment.set_module_variables_for_package(pkg)[source]

Populate the module scope of install() with some useful functions. This makes things easier for package writers.

spack.build_environment.set_wrapper_variables(pkg, env)[source]

Set environment variables used by the Spack compiler wrapper (which have the prefix SPACK_) and also add the compiler wrappers to PATH.

This determines the injected -L/-I/-rpath options; each of these specifies a search order and this function computes these options in a manner that is intended to match the DAG traversal order in modifications_from_dependencies: that method uses a post-order traversal so that PrependPath actions from dependencies take lower precedence; we use a post-order traversal here to match the visitation order of modifications_from_dependencies (so we are visiting the lowest priority packages first).

spack.build_environment.setup_package(pkg, dirty, context='build')[source]

Execute all environment setup routines.

spack.build_environment.start_build_process(pkg, function, kwargs)[source]

Create a child process to do part of a spack build.

Parameters
  • pkg (spack.package.PackageBase) – package whose environment we should set up the child process for.

  • function (typing.Callable) – argless function to run in the child process.

Usage:

def child_fun():
    # do stuff
build_env.start_build_process(pkg, child_fun)

The child process is run with the build environment set up by spack.build_environment. This allows package authors to have full control over the environment, etc. without affecting other builds that might be executed in the same spack call.

If something goes wrong, the child process catches the error and passes it to the parent wrapped in a ChildError. The parent is expected to handle (or re-raise) the ChildError.

This uses multiprocessing.Process to create the child process. The mechanism used to create the process differs on different operating systems and for different versions of Python. In some cases “fork” is used (i.e. the “fork” system call) and in other cases it starts an entirely new Python interpreter process (in the docs this is referred to as the “spawn” start method). Breaking it down by OS:

  • Linux always uses fork.

  • Mac OS uses fork before Python 3.8 and “spawn” for 3.8 and after.

  • Windows always uses the “spawn” start method.

For more information on multiprocessing child process creation mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods

spack.build_environment.write_log_summary(out, log_type, log, last=None)[source]

spack.caches module

Caches used by Spack to store data

class spack.caches.MirrorCache(root, skip_unstable_versions)[source]

Bases: object

store(fetcher, relative_dest)[source]

Fetch and relocate the fetcher’s target into our mirror cache.

Symlink a human readable path in our mirror to the actual storage location.

spack.caches.fetch_cache = <spack.fetch_strategy.FsCache object>

Spack’s local cache for downloaded source archives

spack.caches.fetch_cache_location()[source]

Filesystem cache of downloaded archives.

This prevents Spack from repeatedly fetching the same files when building the same package in different ways or multiple times.

spack.caches.misc_cache = <spack.util.file_cache.FileCache object>

Spack’s cache for small data

spack.caches.misc_cache_location()[source]

The misc_cache is Spack’s cache for small data.

Currently the misc_cache stores indexes for virtual dependency providers and for which packages provide which tags.

spack.ci module

class spack.ci.TemporaryDirectory[source]

Bases: object

spack.ci.can_sign_binaries()[source]
spack.ci.can_verify_binaries()[source]
spack.ci.compute_spec_deps(spec_list, check_index_only=False)[source]

Computes all the dependencies for the spec(s) and generates a JSON object which provides both a list of unique spec names as well as a comprehensive list of all the edges in the dependency graph. For example, given a single spec like ‘readline@7.0’, this function generates the following JSON object:

{
    "dependencies": [
        {
            "depends": "readline/ip6aiun",
            "spec": "readline/ip6aiun"
        },
        {
            "depends": "ncurses/y43rifz",
            "spec": "readline/ip6aiun"
        },
        {
            "depends": "ncurses/y43rifz",
            "spec": "readline/ip6aiun"
        },
        {
            "depends": "pkgconf/eg355zb",
            "spec": "ncurses/y43rifz"
        },
        {
            "depends": "pkgconf/eg355zb",
            "spec": "readline/ip6aiun"
        }
    ],
    "specs": [
        {
          "root_spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-...",
          "spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-highs...",
          "label": "readline/ip6aiun"
        },
        {
          "root_spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-...",
          "spec": "ncurses@6.1%apple-clang@9.1.0 arch=darwin-highsi...",
          "label": "ncurses/y43rifz"
        },
        {
          "root_spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-...",
          "spec": "pkgconf@1.5.4%apple-clang@9.1.0 arch=darwin-high...",
          "label": "pkgconf/eg355zb"
        }
    ]
}
spack.ci.configure_compilers(compiler_action, scope=None)[source]
spack.ci.copy_attributes(attrs_list, src_dict, dest_dict)[source]
spack.ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)[source]
spack.ci.download_and_extract_artifacts(url, work_dir)[source]
spack.ci.find_matching_config(spec, gitlab_ci)[source]
spack.ci.format_job_needs(phase_name, strip_compilers, dep_jobs, osname, build_group, prune_dag, stage_spec_dict, enable_artifacts_buildcache)[source]
spack.ci.format_root_spec(spec, main_phase, strip_compiler)[source]
spack.ci.generate_gitlab_ci_yaml(env, print_summary, output_file, prune_dag=False, check_index_only=False, run_optimizer=False, use_dependencies=False, artifacts_root=None)[source]
spack.ci.get_cdash_build_name(spec, build_group)[source]
spack.ci.get_concrete_specs(env, root_spec, job_name, related_builds, compiler_action)[source]
spack.ci.get_job_name(phase, strip_compiler, spec, osarch, build_group)[source]
spack.ci.get_spack_info()[source]
spack.ci.get_spec_dependencies(specs, deps, spec_labels, check_index_only=False)[source]
spack.ci.get_spec_string(spec)[source]
spack.ci.import_signing_key(base64_signing_key)[source]
spack.ci.is_main_phase(phase_name)[source]
spack.ci.pkg_name_from_spec_label(spec_label)[source]
spack.ci.populate_buildgroup(job_names, group_name, project, site, credentials, cdash_url)[source]
spack.ci.print_staging_summary(spec_labels, dependencies, stages)[source]
spack.ci.push_mirror_contents(env, spec, specfile_path, mirror_url, sign_binaries)[source]
spack.ci.read_cdashid_from_mirror(spec, mirror_url)[source]
spack.ci.register_cdash_build(build_name, base_url, project, site, track)[source]
spack.ci.relate_cdash_builds(spec_map, cdash_base_url, job_build_id, cdash_project, cdashids_mirror_urls)[source]
spack.ci.reproduce_ci_job(url, work_dir)[source]
spack.ci.setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None)[source]
spack.ci.spec_deps_key(s)[source]
spack.ci.spec_matches(spec, match_string)[source]
spack.ci.stage_spec_jobs(specs, check_index_only=False)[source]
Take a set of release specs and generate a list of “stages”, where the jobs in any stage are dependent only on jobs in previous stages. This allows us to maximize build parallelism within the gitlab-ci framework.

Parameters
  • specs (Iterable) – Specs to build

  • check_index_only (bool) – Regardless of whether DAG pruning is enabled, all configured mirrors are searched to see if binaries for specs are up to date on those mirrors. This flag limits that search to the binary cache indices on those mirrors to speed the process up, even though there is no guarantee the index is up to date.

Returns: A tuple of information objects describing the specs, dependencies and stages:

spec_labels: A dictionary mapping the spec labels, which are made of (pkg-name/hash-prefix), to objects containing “rootSpec” and “spec” keys. The root spec is the spec of which this spec is a dependency, and the spec is the formatted spec string for this spec.

deps: A dictionary whose keys also appear as keys in the spec_labels dictionary, and whose values are the set of dependencies for that spec.

stages: An ordered list of sets, each of which contains all the jobs to be built in that stage. The jobs are expressed in the same format as the keys in the spec_labels and deps objects.

spack.ci.url_encode_string(input_string)[source]
spack.ci.write_cdashid_to_mirror(cdashid, spec, mirror_url)[source]

spack.ci_needs_workaround module

spack.ci_needs_workaround.convert_job(job_entry)[source]
spack.ci_needs_workaround.get_job_name(needs_entry)
spack.ci_needs_workaround.needs_to_dependencies(yaml)[source]

spack.ci_optimization module

spack.ci_optimization.add_extends(yaml, key)[source]

Modifies the given object “yaml” so that it includes an “extends” key whose value features “key”.

If “extends” is not in yaml, then yaml is modified such that yaml[“extends”] == key.

If yaml[“extends”] is a str, then yaml is modified such that yaml[“extends”] == [yaml[“extends”], key]

If yaml[“extends”] is a list that does not include key, then key is appended to the list.

Otherwise, yaml is left unchanged.
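A short sketch of the cases described above (the job mapping and key names are hypothetical):

from spack.ci_optimization import add_extends

job = {'script': ['spack install']}
add_extends(job, '.common')   # no 'extends' yet -> job['extends'] == '.common'
add_extends(job, '.linux')    # str value -> job['extends'] == ['.common', '.linux']
add_extends(job, '.linux')    # already in the list -> left unchanged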

spack.ci_optimization.build_histogram(iterator, key)[source]

Builds a histogram of values given an iterable of mappings and a key.

For each mapping “m” with key “key” in iterator, the value m[key] is considered.

Returns a list of tuples (hash, count, proportion, value), where

  • “hash” is a sha1sum hash of the value.

  • “count” is the number of occurrences of values that hash to “hash”.

  • “proportion” is the proportion of all values considered above that hash to “hash”.

  • “value” is one of the values considered above that hash to “hash”. Which value is chosen when multiple values hash to the same “hash” is undefined.

The list is sorted in descending order by count, yielding the most frequently occurring hashes first.
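A small sketch with hypothetical job mappings:

from spack.ci_optimization import build_histogram

jobs = [
    {'image': 'ubuntu:20.04'},
    {'image': 'ubuntu:20.04'},
    {'image': 'centos:7'},
]
for sha, count, proportion, value in build_histogram(jobs, 'image'):
    print(sha[:8], count, proportion, value)
# The most frequent value ('ubuntu:20.04', with count 2) is listed first.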

spack.ci_optimization.common_subobject(yaml, sub)[source]

Factor prototype object “sub” out of the values of mapping “yaml”.

Consider a modified copy of yaml, “new”, where for each key, “key” in yaml:

  • If yaml[key] matches sub, then new[key] = subkeys(yaml[key], sub).

  • Otherwise, new[key] = yaml[key].

If the above match criterion is not satisfied for any such key, then (yaml, None) is returned and the yaml object is left unchanged.

Otherwise, each matching value in new is modified as in add_extends(new[key], common_key), and then new[common_key] is set to sub. The common_key value is chosen such that it does not match any preexisting key in new. In this case, (new, common_key) is returned.

spack.ci_optimization.matches(obj, proto)[source]

Returns True if the test object “obj” matches the prototype object “proto”.

If obj and proto are mappings, obj matches proto if (key in obj) and (obj[key] matches proto[key]) for every key in proto.

If obj and proto are sequences, obj matches proto if they are of the same length and (a matches b) for every (a,b) in zip(obj, proto).

Otherwise, obj matches proto if obj == proto.

Precondition: proto must not have any reference cycles
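For example (both objects are hypothetical GitLab-CI-style mappings):

from spack.ci_optimization import matches

proto = {'variables': {'SPACK_ROOT': '/opt/spack'}}
obj = {'variables': {'SPACK_ROOT': '/opt/spack', 'EXTRA': '1'}, 'script': ['make']}

matches(obj, proto)          # True: every key in proto matches in obj
matches(obj, {'tags': []})   # False: obj has no 'tags' key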

spack.ci_optimization.optimizer(yaml)[source]
spack.ci_optimization.print_delta(name, old, new, applied=None)[source]
spack.ci_optimization.sort_yaml_obj(obj)[source]
spack.ci_optimization.subkeys(obj, proto)[source]

Returns the test mapping “obj” after factoring out the items it has in common with the prototype mapping “proto”.

Consider a recursive merge operation, merge(a, b), on mappings a and b that returns a mapping, m, whose keys are the union of the keys of a and b, and for every such key, “key”, its corresponding value is:

  • merge(a[key], b[key]) if a[key] and b[key] are mappings, or

  • b[key] if (key in b) and not matches(a[key], b[key]),

    or

  • a[key] otherwise

If obj and proto are mappings, the returned object is the smallest object, “a”, such that merge(a, proto) matches obj.

Otherwise, obj is returned.

spack.ci_optimization.try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs)[source]

Try applying an optimization pass and return information about the result

“name” is a string describing the nature of the pass. If it is a non-empty string, summary statistics are also printed to stdout.

“yaml” is the object to apply the pass to.

“optimization_pass” is the function implementing the pass to be applied.

“args” and “kwargs” are the additional arguments to pass to the optimization pass. The pass is applied as

>>> (new_yaml, *other_results) = optimization_pass(yaml, *args, **kwargs)

The pass’s results are greedily rejected if it does not modify the original yaml document, or if it produces a yaml document that serializes to a larger string.

Returns (new_yaml, yaml, applied, other_results) if applied, or (yaml, new_yaml, applied, other_results) otherwise.

spack.compiler module

class spack.compiler.Compiler(cspec, operating_system, target, paths, modules=None, alias=None, environment=None, extra_rpaths=None, enable_implicit_rpaths=None, **kwargs)[source]

Bases: object

This class encapsulates a Spack “compiler”, which includes C, C++, and Fortran compilers. Subclasses should implement support for specific compilers, their possible names, arguments, and how to identify the particular type of compiler.

PrgEnv = None
PrgEnv_compiler = None
property c11_flag
property c99_flag
cc_names = []
property cc_pic_flag

Returns the flag used by the C compiler to produce Position Independent Code (PIC).

property cc_rpath_arg
classmethod cc_version(cc)[source]
property cxx11_flag
property cxx14_flag
property cxx17_flag
property cxx98_flag
cxx_names = []
property cxx_pic_flag

Returns the flag used by the C++ compiler to produce Position Independent Code (PIC).

property cxx_rpath_arg
classmethod cxx_version(cxx)[source]
property debug_flags
classmethod default_version(cc)[source]

Override just this to override all compiler version functions.

property disable_new_dtags
property enable_new_dtags
classmethod extract_version_from_output(output)[source]

Extracts the version from compiler’s output.

f77_names = []
property f77_pic_flag

Returns the flag used by the F77 compiler to produce Position Independent Code (PIC).

property f77_rpath_arg
classmethod f77_version(f77)[source]
fc_names = []
property fc_pic_flag

Returns the flag used by the FC compiler to produce Position Independent Code (PIC).

property fc_rpath_arg
classmethod fc_version(fc)[source]
get_real_version()[source]

Query the compiler for its version.

This is the “real” compiler version, regardless of what is in the compilers.yaml file, which the user can change to name their compiler.

Use the runtime environment of the compiler (modules and environment modifications) to enable the compiler to run properly on any platform.

ignore_version_errors = ()

Return values to ignore when invoking the compiler to get its version

implicit_rpaths()[source]
property linker_arg

Flag that needs to be used to pass an argument to the linker.

property openmp_flag
property opt_flags
prefixes = []
property real_version

Executable-reported compiler version, used for API determinations (e.g. C++11 flag checks).

property required_libs

For executables created with this compiler, the compiler libraries that would generally be required to run them.

classmethod search_regexps(language)[source]
setup_custom_environment(pkg, env)[source]

Set any environment variables necessary to use the compiler.

suffixes = ['-.*']
property verbose_flag

This property should be overridden in the compiler subclass if a verbose flag is available.

If it is not overridden, it is assumed to not be supported.

verify_executables()[source]

Raise an error if any of the compiler executables is not valid.

This method confirms that for all of the compilers (cc, cxx, f77, fc) that have paths, those paths exist and are executable by the current user. Raises a CompilerAccessError if any of the non-null paths for the compiler are not accessible.

property version
version_argument = '-dumpversion'

Compiler argument that produces version information

version_regex = '(.*)'

Regex used to extract version from compiler’s output

spack.concretize module

Functions here are used to take abstract specs and make them concrete. For example, if a spec asks for a version between 1.8 and 1.9, these functions might take the most recent 1.9 version of the package available. Or, if the user didn’t specify a compiler for a spec, then this will assign a compiler to the spec based on defaults or user preferences.

TODO: make this customizable and allow users to configure concretization policies.

class spack.concretize.Concretizer(abstract_spec=None)[source]

Bases: object

You can subclass this class to override some of the default concretization strategies, or you can override all of them.

adjust_target(spec)[source]

Adjusts the target microarchitecture if the compiler is too old to support the default one.

Parameters

spec – spec to be concretized

Returns

True if spec was modified, False otherwise

check_for_compiler_existence = None

Controls whether we check that compiler versions actually exist during concretization. Used for testing and for mirror creation

choose_virtual_or_external(spec)[source]

Given a list of candidate virtual and external packages, try to find one that is most ABI compatible.

concretize_architecture(spec)[source]

If the spec is empty provide the defaults of the platform. If the architecture is not a string type, then check if either the platform, target or operating system are concretized. If any of the fields are changed then return True. If everything is concretized (i.e the architecture attribute is a namedtuple of classes) then return False. If the target is a string type, then convert the string into a concretized architecture. If it has no architecture and the root of the DAG has an architecture, then use the root otherwise use the defaults on the platform.

concretize_compiler(spec)[source]

If the spec already has a compiler, we’re done. If not, then take the compiler used for the nearest ancestor with a compiler spec and use that. If the ancestor’s compiler is not concrete, then use the preferred compiler as specified in spackconfig.

Intuition: Use the spackconfig default if no package that depends on this one has a strict compiler requirement. Otherwise, try to build with the compiler that will be used by libraries that link to this one, to maximize compatibility.

concretize_compiler_flags(spec)[source]

The compiler flags are updated to match those of the spec whose compiler is used, defaulting to no compiler flags in the spec. Default specs set at the compiler level will still be added later.

concretize_develop(spec)[source]

Add dev_path=* variant to packages built from local source.

concretize_variants(spec)[source]

If the spec already has variants filled in, return. Otherwise, add the user preferences from packages.yaml or the default variants from the package specification.

concretize_version(spec)[source]

If the spec is already concrete, return. Otherwise take the preferred version from spackconfig, and default to the package’s version if there are no available versions.

TODO: In many cases we probably want to look for installed versions of each package and use an installed version if we can link to it. The policy implemented here will tend to rebuild a lot of stuff because it will prefer a compiler in the spec to any compiler already-installed things were built with. There is likely some better policy that finds some middle ground between these two extremes.

target_from_package_preferences(spec)[source]

Returns the preferred target from the package preferences if there’s any.

Parameters

spec – abstract spec to be concretized

exception spack.concretize.InsufficientArchitectureInfoError(spec, archs)[source]

Bases: spack.error.SpackError

Raised when details on architecture cannot be collected from the system

exception spack.concretize.NoBuildError(spec)[source]

Bases: spack.error.SpecError

Raised when a package is configured with the buildable option False, but no satisfactory external versions can be found

exception spack.concretize.NoCompilersForArchError(arch, available_os_targets)[source]

Bases: spack.error.SpackError

exception spack.concretize.NoValidVersionError(spec)[source]

Bases: spack.error.SpackError

Raised when there is no way to have a concrete version for a particular spec.

exception spack.concretize.UnavailableCompilerVersionError(compiler_spec, arch=None)[source]

Bases: spack.error.SpackError

Raised when there is no available compiler that satisfies a compiler spec.

spack.concretize.concretize_specs_together(*abstract_specs, **kwargs)[source]

Given a number of specs as input, tries to concretize them together.

Parameters
  • tests (bool or list or set) – False to run no tests, True to test all packages, or a list of package names to run tests for some

  • *abstract_specs – abstract specs to be concretized, given either as Specs or strings

Returns

List of concretized specs

spack.concretize.disable_compiler_existence_check()[source]
spack.concretize.enable_compiler_existence_check()[source]
spack.concretize.find_spec(spec, condition, default=None)[source]

Searches the dag from spec in an intelligent order and looks for a spec that matches a condition

class spack.concretize.reverse_order(value)[source]

Bases: object

Helper for creating key functions.

This is a wrapper that inverts the sense of the natural comparisons on the object.

spack.config module

This module implements Spack’s configuration file handling.

This implements Spack’s configuration system, which handles merging multiple scopes with different levels of precedence. See the documentation on Configuration Scopes for details on how Spack’s configuration system behaves. The scopes are:

  1. default

  2. system

  3. site

  4. user

And corresponding per-platform scopes. Important functions in this module are:

get_config reads in YAML data for a particular scope and returns it. Callers can then modify the data and write it back with update_config.

When read in, Spack validates configurations with jsonschemas. The schemas are in submodules of spack.schema.

exception spack.config.ConfigError(message, long_message=None)[source]

Bases: spack.error.SpackError

Superclass for all Spack config related errors.

exception spack.config.ConfigFileError(message, long_message=None)[source]

Bases: spack.config.ConfigError

Issue reading or accessing a configuration file.

exception spack.config.ConfigFormatError(validation_error, data, filename=None, line=None)[source]

Bases: spack.config.ConfigError

Raised when a configuration format does not match its schema.

class spack.config.ConfigScope(name, path)[source]

Bases: object

This class represents a configuration scope.

A scope is one directory containing named configuration files. Each file is a config “section” (e.g., mirrors, compilers, etc).

clear()[source]

Empty cached config information.

get_section(section)[source]
get_section_filename(section)[source]
property is_platform_dependent
exception spack.config.ConfigSectionError(message, long_message=None)[source]

Bases: spack.config.ConfigError

Error for referring to a bad config section name in a configuration.

class spack.config.Configuration(*scopes)[source]

Bases: object

A full Spack configuration, from a hierarchy of config files.

This class makes it easy to add a new scope on top of an existing one.

clear_caches()[source]

Clears the caches for configuration files,

This will cause files to be re-read upon the next request.

property file_scopes

List of writable scopes with an associated file.

get(path, default=None, scope=None)[source]

Get a config section or a single value from one.

Accepts a path syntax that allows us to grab nested config map entries. Getting the ‘config’ section would look like:

spack.config.get('config')

and the dirty section in the config scope would be:

spack.config.get('config:dirty')

We use : as the separator, like YAML objects.

get_config(section, scope=None)[source]

Get configuration settings for a section.

If scope is None or not provided, return the merged contents of all of Spack’s configuration scopes. If scope is provided, return only the configuration as specified in that scope.

This strips off the top-level name from the YAML section. That is, for a YAML config file that looks like this:

config:
  install_tree:
    root: $spack/opt/spack
  build_stage:
  - $tmpdir/$user/spack-stage

get_config('config') will return:

{ 'install_tree': {
      'root': '$spack/opt/spack',
  },
  'build_stage': ['$tmpdir/$user/spack-stage']
}
get_config_filename(scope, section)[source]

For some scope and section, get the name of the configuration file.

highest_precedence_non_platform_scope()[source]

Non-internal non-platform scope with highest precedence

Platform-specific scopes are of the form scope/platform

highest_precedence_scope()[source]

Non-internal scope with highest precedence.

matching_scopes(reg_expr)[source]

List of all scopes whose names match the provided regular expression.

For example, matching_scopes(r’^command’) will return all scopes whose names begin with command.

pop_scope()[source]

Remove the highest precedence scope and return it.

print_section(section, blame=False)[source]

Print a configuration to stdout.

push_scope(scope)[source]

Add a higher precedence scope to the Configuration.

remove_scope(scope_name)[source]

Remove scope by name; has no effect when scope_name does not exist

set(path, value, scope=None)[source]

Convenience function for setting single values in config files.

Accepts the path syntax described in get().
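For example, using the singleton Configuration instance documented in this module (the value and scope are illustrative):

import spack.config

spack.config.config.set('config:build_jobs', 8, scope='user')
spack.config.config.get('config:build_jobs')   # -> 8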

update_config(section, update_data, scope=None, force=False)[source]

Update the configuration file for a particular scope.

Overwrites contents of a section in a scope with update_data, then writes out the config file.

update_data should have the top-level section name stripped off (it will be re-added). Data itself can be a list, dict, or any other yaml-ish structure.

Configuration scopes that are still written in an old schema format will fail to update unless force is True.

Parameters
  • section (str) – section of the configuration to be updated

  • update_data (dict) – data to be used for the update

  • scope (str) – scope to be updated

  • force (bool) – force the update

class spack.config.ImmutableConfigScope(name, path)[source]

Bases: spack.config.ConfigScope

A configuration scope that cannot be written to.

This is used for ConfigScopes passed on the command line.

class spack.config.InternalConfigScope(name, data=None)[source]

Bases: spack.config.ConfigScope

An internal configuration scope that is not persisted to a file.

This is for spack internal use so that command-line options and config file settings are accessed the same way, and Spack can easily override settings from files.

clear()[source]

Empty cached config information.

get_section(section)[source]

Just reads from an internal dictionary.

get_section_filename(section)[source]
class spack.config.SingleFileScope(name, path, schema, yaml_path=None)[source]

Bases: spack.config.ConfigScope

This class represents a configuration scope in a single YAML file.

get_section(section)[source]
get_section_filename(section)[source]
property is_platform_dependent
spack.config.add(fullpath, scope=None)[source]

Add the given configuration to the specified config scope. Add accepts a path. If you want to add from a filename, use add_from_file

spack.config.add_from_file(filename, scope=None)[source]

Add updates to a config from a filename

spack.config.command_line_scopes = []

Configuration scopes added on the command line, set by spack.main.main().

spack.config.config = <spack.config.Configuration object>

This is the singleton configuration instance for Spack.

spack.config.config_defaults = {'config': {'build_jobs': 2, 'build_stage': '$tempdir/spack-stage', 'checksum': True, 'concretizer': 'original', 'connect_timeout': 10, 'debug': False, 'dirty': False, 'verify_ssl': True}}

Hard-coded default values for some key configuration options. This ensures that Spack will still work even if config.yaml in the defaults scope is removed.

spack.config.configuration_defaults_path = ('defaults', '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/latest/lib/spack/docs/_spack_root/etc/spack/defaults')

Path to the default configuration

spack.config.default_list_scope()[source]

Return the config scope that is listed by default.

Commands that list configuration list all scopes (merged) by default.

spack.config.default_modify_scope(section='config')[source]

Return the config scope that commands should modify by default.

Commands that modify configuration by default modify the highest priority scope.

Parameters

section (str) – Section for which to get the default scope. If this is not ‘compilers’, a general (non-platform) scope is used.

spack.config.ensure_latest_format_fn(section)[source]

Return a function that takes as input a dictionary read from a configuration file and updates it to the latest format.

The function returns True if there was any update, False otherwise.

Parameters

section (str) – section of the configuration e.g. “packages”, “config”, etc.

spack.config.first_existing(dictionary, keys)[source]

Get the value of the first key in keys that is in the dictionary.

spack.config.get(path, default=None, scope=None)[source]

Module-level wrapper for Configuration.get().

spack.config.get_valid_type(path)[source]

Returns an instance of a type that will pass validation for path.

The instance is created by calling the constructor with no arguments. If multiple types will satisfy validation for data at the configuration path given, the priority order is list, dict, str, bool, int, float.

spack.config.merge_yaml(dest, source)[source]

Merges source into dest; entries in source take precedence over dest.

This routine may modify dest and should be assigned to dest, in case dest was None to begin with, e.g.:

dest = merge_yaml(dest, source)

In the result, elements from lists from source will appear before elements of lists from dest. Likewise, when iterating over keys or items in merged OrderedDict objects, keys from source will appear before keys from dest.

Config file authors can optionally end any attribute in a dict with :: instead of :, and the key will override that of the parent instead of merging.
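A minimal sketch with hypothetical configuration fragments:

from spack.config import merge_yaml

dest = {'config': {'build_jobs': 4, 'ccache': True}}
source = {'config': {'build_jobs': 16}}
dest = merge_yaml(dest, source)
# -> {'config': {'build_jobs': 16, 'ccache': True}}; source entries win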

spack.config.override(path_or_scope, value=None)[source]

Simple way to override config settings within a context.

Parameters
  • path_or_scope (ConfigScope or str) – scope or single option to override

  • value (object or None) – value for the single option

Temporarily push a scope on the current configuration, then remove it after the context completes. If a single option is provided, create an internal config scope for it and push/pop that scope.
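For example (the option and value are illustrative):

import spack.config

with spack.config.override('config:verify_ssl', False):
    assert spack.config.get('config:verify_ssl') is False
# the previous value is restored when the context exits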

spack.config.overrides_base_name = 'overrides-'

Base name for the (internal) overrides scope.

spack.config.process_config_path(path)[source]
spack.config.read_config_file(filename, schema=None)[source]

Read a YAML configuration file.

The caller can provide a schema for validation. If no schema is provided, the schema is inferred from the top-level key.

spack.config.scopes()[source]

Convenience function to get list of configuration scopes.

spack.config.scopes_metavar = '{defaults,system,site,user}[/PLATFORM]'

Metavar to use for commands that accept scopes; this is shorter and more readable than listing all choices.

spack.config.section_schemas = {'bootstrap': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'bootstrap': {'properties': {'enable': {'type': 'boolean'}, 'root': {'type': 'string'}, 'sources': {'items': {'additionalProperties': False, 'properties': {'description': {'type': 'string'}, 'info': {'type': 'object'}, 'name': {'type': 'string'}, 'type': {'type': 'string'}}, 'required': ['name', 'description', 'type'], 'type': 'object'}, 'type': 'array'}, 'trusted': {'patternProperties': {'\\w[\\w-]*': {'type': 'boolean'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack bootstrap configuration file schema', 'type': 'object'}, 'compilers': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'compilers': {'items': [{'type': 'object', 'additionalProperties': False, 'properties': {'compiler': {'type': 'object', 'additionalProperties': False, 'required': ['paths', 'spec', 'modules', 'operating_system'], 'properties': {'paths': {'type': 'object', 'required': ['cc', 'cxx', 'f77', 'fc'], 'additionalProperties': False, 'properties': {'cc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxx': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'f77': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}}, 'flags': {'type': 'object', 'additionalProperties': False, 'properties': {'cflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxxflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cppflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldlibs': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}}, 'spec': {'type': 'string'}, 'operating_system': {'type': 'string'}, 'target': {'type': 'string'}, 'alias': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'modules': {'anyOf': [{'type': 'string'}, {'type': 'null'}, {'type': 'array'}]}, 'implicit_rpaths': {'anyOf': [{'type': 'array', 'items': {'type': 'string'}}, {'type': 'boolean'}]}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}, 'extra_rpaths': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}}}], 'type': 'array'}}, 'title': 'Spack compiler configuration file schema', 'type': 'object'}, 'config': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'config': {'default': {}, 'properties': {'allow_sgid': {'type': 'boolean'}, 'binary_index_root': {'type': 'string'}, 'build_jobs': {'minimum': 1, 'type': 'integer'}, 'build_language': {'type': 'string'}, 'build_stage': {'oneOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}, 'ccache': {'type': 'boolean'}, 'checksum': {'type': 'boolean'}, 'concretizer': {'enum': ['original', 'clingo'], 'type': 'string'}, 
'connect_timeout': {'minimum': 0, 'type': 'integer'}, 'db_lock_timeout': {'minimum': 1, 'type': 'integer'}, 'debug': {'type': 'boolean'}, 'deprecated': {'type': 'boolean'}, 'dirty': {'type': 'boolean'}, 'extensions': {'items': {'type': 'string'}, 'type': 'array'}, 'install_hash_length': {'minimum': 1, 'type': 'integer'}, 'install_missing_compilers': {'type': 'boolean'}, 'install_path_scheme': {'type': 'string'}, 'install_tree': {'anyOf': [{'type': 'object', 'properties': {'root': {'type': 'string'}, 'padded_length': {'oneOf': [{'type': 'integer', 'minimum': 0}, {'type': 'boolean'}]}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}}}, {'type': 'string'}]}, 'locks': {'type': 'boolean'}, 'misc_cache': {'type': 'string'}, 'module_roots': {'additionalProperties': False, 'deprecatedProperties': {'error': False, 'message': 'specifying a "dotkit" module root has no effect [support for "dotkit" has been dropped in v0.13.0]', 'properties': ['dotkit']}, 'properties': {'dotkit': {'type': 'string'}, 'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}, 'package_lock_timeout': {'anyOf': [{'type': 'integer', 'minimum': 1}, {'type': 'null'}]}, 'shared_linking': {'enum': ['rpath', 'runpath'], 'type': 'string'}, 'source_cache': {'type': 'string'}, 'suppress_gpg_warnings': {'type': 'boolean'}, 'template_dirs': {'items': {'type': 'string'}, 'type': 'array'}, 'test_stage': {'type': 'string'}, 'url_fetch_method': {'enum': ['urllib', 'curl'], 'type': 'string'}, 'verify_ssl': {'type': 'boolean'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}, 'mirrors': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'mirrors': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'object', 'required': ['fetch', 'push'], 'properties': {'fetch': {'type': ['string', 'object']}, 'push': {'type': ['string', 'object']}}}]}}, 'type': 'object'}}, 'title': 'Spack mirror configuration file schema', 'type': 'object'}, 'modules': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'modules': {'deprecatedProperties': {'error': False, 'message': 'the "dotkit" section in modules.yaml has no effect [support for "dotkit" has been dropped in v0.13.0]', 'properties': ['dotkit']}, 'patternProperties': {'(?!enable|lmod|tcl|dotkit|prefix_inspections)^\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'deprecatedProperties': {'error': False, 'message': 'the "dotkit" section in modules.yaml has no effect [support for "dotkit" has been dropped in v0.13.0]', 'properties': ['dotkit']}, 'properties': {'arch_folder': {'type': 'boolean'}, 'dotkit': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 
'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': 
{'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {}]}, 'enable': {'default': [], 'deprecatedProperties': {'error': False, 'message': 'cannot enable "dotkit" in modules.yaml [support for "dotkit" has been dropped in v0.13.0]', 'properties': ['dotkit']}, 'items': {'enum': ['tcl', 'dotkit', 'lmod'], 'type': 'string'}, 'type': 'array'}, 'lmod': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 
'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {'type': 'object', 'properties': {'core_compilers': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'hierarchy': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'core_specs': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}]}, 'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'roots': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}, 'tcl': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 
'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 
'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {}]}, 'use_view': {'anyOf': [{'type': 'string'}, {'type': 'boolean'}]}}, 'type': 'object'}}, 'properties': {'arch_folder': {'type': 'boolean'}, 'dotkit': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 
'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {}]}, 'enable': {'default': [], 'deprecatedProperties': {'error': False, 'message': 'cannot enable "dotkit" in modules.yaml [support for "dotkit" has been dropped in v0.13.0]', 'properties': ['dotkit']}, 'items': {'enum': ['tcl', 'dotkit', 'lmod'], 'type': 'string'}, 'type': 'array'}, 'lmod': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': 
{'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 
'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {'type': 'object', 'properties': {'core_compilers': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'hierarchy': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'core_specs': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}]}, 'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'roots': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}, 'tcl': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|whitelist|blacklist|projections|naming_scheme|core_compilers|all|defaults)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': 
{'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {}]}, 'use_view': {'anyOf': [{'type': 'string'}, {'type': 'boolean'}]}}, 'type': 'object'}}, 'title': 'Spack module file configuration file schema', 'type': 'object'}, 'packages': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'packages': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'deprecatedProperties': {'error': False, 'message': <function deprecate_paths_and_modules>, 'properties': ['modules', 'paths']}, 'properties': {'buildable': {'default': True, 'type': 'boolean'}, 'compiler': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'externals': {'items': {'additionalProperties': True, 'properties': {'extra_attributes': {'type': 'object'}, 'modules': {'items': {'type': 'string'}, 'type': 'array'}, 'prefix': {'type': 'string'}, 'spec': {'type': 'string'}}, 'required': ['spec'], 'type': 'object'}, 'type': 'array'}, 'modules': {'type': 'object'}, 
'paths': {'type': 'object'}, 'permissions': {'additionalProperties': False, 'properties': {'group': {'type': 'string'}, 'read': {'enum': ['user', 'group', 'world'], 'type': 'string'}, 'write': {'enum': ['user', 'group', 'world'], 'type': 'string'}}, 'type': 'object'}, 'providers': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'target': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'variants': {'oneOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}, 'version': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack package configuration file schema', 'type': 'object'}, 'repos': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'repos': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'title': 'Spack repository configuration file schema', 'type': 'object'}, 'upstreams': {'$schema': 'http://json-schema.org/schema#', 'additionalProperties': False, 'properties': {'upstreams': {'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'properties': {'install_tree': {'type': 'string'}, 'modules': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}}

Dict from section names -> schema for that section

spack.config.set(path, value, scope=None)[source]

Convenience function for setting single values in config files.

Accepts the path syntax described in get().
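
A minimal sketch of the shared path syntax, assuming the user scope is writable:

import spack.config

# Read a single value, falling back to a default when it is unset.
jobs = spack.config.get('config:build_jobs', default=2)

# Write a single value into the user scope.
spack.config.set('config:build_jobs', 4, scope='user')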

spack.config.use_configuration(*scopes_or_paths)[source]

Use the configuration scopes passed as arguments within the context manager.

Parameters

*scopes_or_paths – scope objects or paths to be used

Returns

Configuration object associated with the scopes passed as arguments
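
A hedged sketch of activating an alternative configuration for the duration of a block (the directory path is hypothetical):

import spack.config

# Only scopes built from the given path(s) are active inside the block;
# the previous configuration is restored afterwards.
with spack.config.use_configuration('/tmp/custom-spack-config') as cfg:
    print(cfg.get('config:build_jobs'))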

spack.config.validate(data, schema, filename=None)[source]

Validate data read in from a Spack YAML file.

Parameters
  • data (dict or list) – data read from a Spack YAML file

  • schema (dict or list) – jsonschema to validate data

This leverages the line information (start_mark, end_mark) stored on Spack YAML structures.
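
For instance, data for the config section can be checked against the schema registered in section_schemas (a sketch; the data below is hypothetical):

import spack.config

data = {'config': {'build_jobs': 4, 'verify_ssl': True}}
# Raises a validation error if data does not conform to the schema.
spack.config.validate(data, spack.config.section_schemas['config'])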

spack.database module

Spack’s installation tracking database.

The database serves two purposes:

  1. It implements a cache on top of a potentially very large Spack directory hierarchy, speeding up many operations that would otherwise require filesystem access.

  2. It will allow us to track external installations as well as lost packages and their dependencies.

Prior to the implementation of this store, a directory layout served as the authoritative database of packages in Spack. This module provides a cache and a sanity checking mechanism for what is in the filesystem.

exception spack.database.CorruptDatabaseError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when errors are found while reading the database.

class spack.database.Database(root, db_dir=None, upstream_dbs=None, is_upstream=False, enable_transaction_locking=True, record_fields=['spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for'])[source]

Bases: object

Per-process lock objects for each install prefix.

activated_extensions_for(spec_like, *args, **kwargs)[source]
add(spec_like, *args, **kwargs)[source]
clear_all_failures()[source]

Force remove install failure tracking files.

clear_failure(spec, force=False)[source]

Remove any persistent and cached failure tracking for the spec.

see mark_failed().

Parameters
  • spec (spack.spec.Spec) – the spec whose failure indicators are being removed

  • force (bool) – True if the failure information should be cleared when a prefix failure lock exists for the file or False if the failure should not be cleared (e.g., it may be associated with a concurrent build)

db_for_spec_hash(hash_key)[source]
deprecate(spec_like, *args, **kwargs)[source]
deprecator(spec)[source]

Return the spec that the given spec is deprecated for, or None

get_by_hash(dag_hash, default=None, installed=<built-in function any>)[source]

Look up a spec by DAG hash, or by a DAG hash prefix.

Parameters
  • dag_hash (str) – hash (or hash prefix) to look up

  • default (object or None) – default value to return if dag_hash is not in the DB (default: None)

  • installed (bool or InstallStatus or typing.Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)

installed defaults to any so that we can refer to any known hash. Note that query() and query_one() differ in that they only return installed specs by default.

Returns

a list of specs matching the hash or hash prefix

Return type

(list)
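
A usage sketch against the active store database (the hash prefix below is hypothetical):

import spack.store

# Look up specs whose DAG hash starts with the given prefix,
# regardless of install status.
matches = spack.store.db.get_by_hash('abc123')
if matches:
    print(matches[0].short_spec)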

get_by_hash_local(*args, **kwargs)[source]

Look up a spec in this DB by DAG hash, or by a DAG hash prefix.

Parameters
  • dag_hash (str) – hash (or hash prefix) to look up

  • default (object or None) – default value to return if dag_hash is not in the DB (default: None)

  • installed (bool or InstallStatus or typing.Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)

installed defaults to any so that we can refer to any known hash. Note that query() and query_one() differ in that they only return installed specs by default.

Returns

a list of specs matching the hash or hash prefix

Return type

(list)

get_record(spec_like, *args, **kwargs)[source]
installed_extensions_for(spec_like, *args, **kwargs)[source]
installed_relatives(spec_like, *args, **kwargs)[source]
is_occupied_install_prefix(path)[source]
mark(spec_like, *args, **kwargs)[source]
mark_failed(spec)[source]

Mark a spec as failing to install.

Prefix failure marking takes two forms: a byte range lock on the nth byte of a file, used to coordinate between concurrent parallel build processes, and a persistent file, named with the full hash and containing the spec, stored in a subdirectory of the database so that the failure persists across overlapping but separate related build processes.

The failure lock file, spack.store.db.prefix_failures, lives alongside the install DB. n is the sys.maxsize-bit prefix of the associated DAG hash, which makes the likelihood of collision very low with no cleanup required.

missing(spec)[source]
prefix_failed(spec)[source]

Return True if the prefix (installation) is marked as failed.

prefix_failure_locked(spec)[source]

Return True if a process has a failure lock on the spec.

prefix_failure_marked(spec)[source]

Determine if the spec has a persistent failure marking.

prefix_lock(spec, timeout=None)[source]

Get a lock on a particular spec’s installation directory.

NOTE: The installation directory does not need to exist.

Prefix lock is a byte range lock on the nth byte of a file.

The lock file is spack.store.db.prefix_lock – the DB tells us what to call it and it lives alongside the install DB.

n is the sys.maxsize-bit prefix of the DAG hash. This makes the likelihood of collision very low AND it gives us readers-writer lock semantics with just a single lockfile, so no cleanup is required.

prefix_read_lock(spec)[source]
prefix_write_lock(spec)[source]
query(*args, **kwargs)[source]

Query the Spack database including all upstream databases.

Parameters
  • query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec).

  • known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed.

  • installed (bool or InstallStatus or typing.Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)

  • explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled in as a dependency of a user-requested spec, it is considered implicit.

  • start_date (datetime.datetime or None) – filters the query, discarding specs that were installed before start_date.

  • end_date (datetime.datetime or None) – filters the query, discarding specs that were installed after end_date.

  • hashes (typing.Container) – list or set of hashes that we can use to restrict the search

  • in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.

Returns

list of specs that match the query
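
A usage sketch against the active store database, combining a few of the filters above (the package name and date window are hypothetical):

import datetime

import spack.store

# Explicitly installed zlib specs from the last 30 days.
cutoff = datetime.datetime.now() - datetime.timedelta(days=30)
recent = spack.store.db.query('zlib', explicit=True, start_date=cutoff)
for spec in recent:
    print(spec.short_spec)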

query_by_spec_hash(hash_key, data=None)[source]
query_local(*args, **kwargs)[source]

Query only the local Spack database.

Parameters
  • query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec).

  • known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed.

  • installed (bool or InstallStatus or typing.Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)

  • explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled in as a dependency of a user-requested spec, it is considered implicit.

  • start_date (datetime.datetime or None) – filters the query, discarding specs that were installed before start_date.

  • end_date (datetime.datetime or None) – filters the query, discarding specs that were installed after end_date.

  • hashes (typing.Container) – list or set of hashes that we can use to restrict the search

  • in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.

Returns

list of specs that match the query

query_one(query_spec, known=<built-in function any>, installed=True)[source]

Query for exactly one spec that matches the query spec.

Raises an assertion error if more than one spec matches the query. Returns None if no installed package matches.

read_transaction()[source]

Get a read lock context manager for use in a with block.

reindex(directory_layout)[source]

Build database index from scratch based on a directory layout.

Locks the DB if it isn’t locked already.

remove(spec_like, *args, **kwargs)[source]
specs_deprecated_by(spec)[source]

Return all specs deprecated in favor of the given spec

property unused_specs

Return all the specs that are currently installed but not needed at runtime to satisfy a user’s requests.

Specs in the return list are those that are neither:
  1. Installed on an explicit user request, nor

  2. Installed as a “run” or “link” dependency (even transitively) of a spec at point 1.

update_explicit(spec, explicit)[source]

Update the spec’s explicit state in the database.

Parameters
  • spec (spack.spec.Spec) – the spec whose install record is being updated

  • explicit (bool) – True if the package was requested explicitly by the user, False if it was pulled in as a dependency of an explicit package.

write_transaction()[source]

Get a write lock context manager for use in a with block.
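
A minimal sketch of grouping database operations under explicit transactions (the query string is hypothetical):

import spack.store

db = spack.store.db

# Group several reads under a single read lock.
with db.read_transaction():
    installed = db.query('hdf5')

# Mutations can likewise be grouped under a single write lock.
with db.write_transaction():
    for spec in installed:
        db.update_explicit(spec, True)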

class spack.database.ForbiddenLock[source]

Bases: object

exception spack.database.ForbiddenLockError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when an upstream DB attempts to acquire a lock

class spack.database.InstallRecord(spec, path, installed, ref_count=0, explicit=False, installation_time=None, deprecated_for=None, in_buildcache=False)[source]

Bases: object

A record represents one installation in the DB.

The record keeps track of the spec for the installation, its install path, AND whether or not it is installed. We need the installed flag in case a user either:

  1. blew away a directory, or

  2. used spack uninstall -f to get rid of it

If, in either case, the package was removed but others still depend on it, we still need to track its spec, so we don’t actually remove it from the database until a spec has no installed dependents left.

Parameters
  • spec (spack.spec.Spec) – spec tracked by the install record

  • path (str) – path where the spec has been installed

  • installed (bool) – whether or not the spec is currently installed

  • ref_count (int) – number of specs that depend on this one

  • explicit (bool or None) – whether or not this spec was explicitly installed, or pulled-in as a dependency of something else

  • installation_time (datetime.datetime or None) – time of the installation

classmethod from_dict(spec, dictionary)[source]
install_type_matches(installed)[source]
to_dict(include_fields=['spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for'])[source]
class spack.database.InstallStatus[source]

Bases: str

class spack.database.InstallStatuses[source]

Bases: object

DEPRECATED = 'deprecated'
INSTALLED = 'installed'
MISSING = 'missing'
classmethod canonicalize(query_arg)[source]
exception spack.database.InvalidDatabaseVersionError(expected, found)[source]

Bases: spack.error.SpackError

exception spack.database.MissingDependenciesError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when DB cannot find records for dependencies

exception spack.database.NonConcreteSpecAddError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when attempting to add a non-concrete spec to the DB.

exception spack.database.UpstreamDatabaseLockingError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when an operation would need to lock an upstream database

spack.database.nullcontext(*args, **kwargs)[source]

spack.dependency module

Data structures that represent Spack’s dependency relationships.

class spack.dependency.Dependency(pkg, spec, type=('build', 'link'))[source]

Bases: object

Class representing metadata for a dependency on a package.

This class differs from spack.spec.DependencySpec because it represents metadata at the Package level. spack.spec.DependencySpec is a descriptor for an actual package configuration, while Dependency is a descriptor for a package’s dependency requirements.

A dependency is a requirement for a configuration of another package that satisfies a particular spec. The dependency can have types, which determine how that package configuration is required, e.g. whether it is required for building the package, whether it needs to be linked to, or whether it is needed at runtime so that Spack can call commands from it.

A package can also depend on another package with patches. This is for cases where the maintainers of one package also maintain special patches for their dependencies. If one package depends on another with patches, a special version of that dependency with patches applied will be built for use by the dependent package. The patches are included in the new version’s spec hash to differentiate it from unpatched versions of the same package, so that unpatched versions of the dependency package can coexist with the patched version.

merge(other)[source]

Merge constraints, deptypes, and patches of other into self.

property name

Get the name of the dependency package.

spack.dependency.all_deptypes = ('build', 'link', 'run', 'test')

The types of dependency relationships that Spack understands.

spack.dependency.canonical_deptype(deptype)[source]

Convert deptype to a canonical sorted tuple, or raise ValueError.

Parameters

deptype (str or list or tuple) – string representing dependency type, or a list/tuple of such strings. Can also be the builtin function all or the string ‘all’, which result in a tuple of all dependency types known to Spack.
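
For instance, a sketch of the documented behavior:

from spack.dependency import all_deptypes, canonical_deptype

print(canonical_deptype('all'))             # ('build', 'link', 'run', 'test'), i.e. all_deptypes
print(canonical_deptype(('run', 'build')))  # ('build', 'run')
# A string that is not a known dependency type raises ValueError.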

spack.dependency.default_deptype = ('build', 'link')

Default dependency type if none is specified

spack.dependency.deptype_chars(*type_tuples)[source]

Create a string representing deptypes for many dependencies.

The string will be some subset of ‘blrt’, like ‘bl ‘, ‘b t’, or ‘ lr ‘ where each letter in ‘blrt’ stands for ‘build’, ‘link’, ‘run’, and ‘test’ (the dependency types).

For a single dependency, this just indicates that the dependency has the indicated deptypes. For a list of dependencies, this shows whether ANY dependency in the list has the deptypes (so the deptypes are merged).
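
For illustration only (again, not the real implementation), the merged rendering can be sketched as:

def deptype_chars_sketch(*type_tuples):
    # A letter appears if ANY of the given tuples contains that deptype;
    # otherwise a space is emitted in its place.
    present = {t for tup in type_tuples for t in tup}
    return ''.join(c if t in present else ' '
                   for c, t in zip('blrt', ('build', 'link', 'run', 'test')))

print(repr(deptype_chars_sketch(('build', 'link'))))            # 'bl  '
print(repr(deptype_chars_sketch(('build',), ('run', 'test'))))  # 'b rt'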

spack.directives module

This package contains directives that can be used within a package.

Directives are functions that can be called inside a package definition to modify the package, for example:

class OpenMpi(Package):
    depends_on("hwloc")
    provides("mpi")
    ...

provides and depends_on are spack directives.

The available directives are listed below (a short hypothetical recipe combining several of them follows the list):

  • conflicts

  • depends_on

  • extends

  • patch

  • provides

  • resource

  • variant

  • version
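
For illustration, a hypothetical recipe might combine several of these directives; the package name, URL, checksum placeholder, and constraints below are invented and not part of Spack:

# Inside a hypothetical package.py (the Spack package DSL is in scope there).
class Libexample(Package):
    """Invented package used only to illustrate the directives above."""

    homepage = "https://example.com/libexample"
    url = "https://example.com/libexample-1.2.0.tar.gz"

    version('1.2.0', sha256='<sha256 of the release tarball>')

    variant('shared', default=True, description='Build shared libraries')

    depends_on('cmake', type='build')
    depends_on('zlib')

    conflicts('%gcc@:4.8', msg='libexample requires a newer GCC')

    patch('fix-build.patch', when='@1.2.0')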

exception spack.directives.DirectiveError(message, long_message=None)[source]

Bases: spack.error.SpackError

This is raised when something is wrong with a package directive.

class spack.directives.DirectiveMeta(name, bases, attr_dict)[source]

Bases: type

Flushes the directives that were temporarily stored in the staging area into the package.

static directive(dicts=None)[source]

Decorator for Spack directives.

Spack directives allow you to modify a package while it is being defined, e.g. to add version or dependency information. Directives are one of the key pieces of Spack’s package “language”, which is embedded in python.

Here’s an example directive:

@directive(dicts='versions')
def version(pkg, ...):
    ...

This directive allows you to write:

class Foo(Package):
    version(...)

The @directive decorator handles a couple things for you:

  1. Adds the class scope (pkg) as an initial parameter when called, like a class method would. This allows you to modify a package from within a directive, while the package is still being defined.

  2. It automatically adds a dictionary called “versions” to the package so that you can refer to pkg.versions.

The (dicts='versions') part ensures that ALL packages in Spack will have a versions attribute after they’re constructed, and that if no directive actually modified it, it will just be an empty dict.

This is just a modular way to add storage attributes to the Package class, and it’s how Spack gets information from the packages to the core.

static pop_from_context()[source]

Pop the last constraint from the context

static push_to_context(when_spec)[source]

Add a spec to the context constraints.

spack.directives.conflicts(conflict_spec, when=None, msg=None)[source]

Allows a package to define a conflict.

Currently, a “conflict” is a concretized configuration that is known to be non-valid. For example, a package that is known not to be buildable with intel compilers can declare:

conflicts('%intel')

To express the same constraint only when the ‘foo’ variant is activated:

conflicts('%intel', when='+foo')
Parameters
  • conflict_spec (spack.spec.Spec) – constraint defining the known conflict

  • when (spack.spec.Spec) – optional constraint that triggers the conflict

  • msg (str) – optional user defined message

spack.directives.depends_on(spec, when=None, type=('build', 'link'), patches=None)[source]

Creates a dict of deps with specs defining when they apply.

Parameters
  • spec (spack.spec.Spec or str) – the package and constraints depended on

  • when (spack.spec.Spec or str) – when the dependent satisfies this, it has the dependency represented by spec

  • type (str or tuple) – str or tuple of legal Spack deptypes

  • patches (typing.Callable or list) – single result of patch() directive, a str to be passed to patch, or a list of these

This directive is to be used inside a Package definition to declare that the package requires other packages to be built first. @see The section “Dependency specs” in the Spack Packaging Guide.

spack.directives.extends(spec, type=('build', 'run'), **kwargs)[source]

Same as depends_on, but allows symlinking into dependency’s prefix tree.

This is for Python and other language modules where the module needs to be installed into the prefix of the Python installation. Spack handles this by installing modules into their own prefix, but allowing ONE module version to be symlinked into a parent Python install at a time, using spack activate.

Keyword arguments can be passed to extends() so that extension packages can pass parameters to the extendee's extension mechanism.

spack.directives.patch(url_or_filename, level=1, when=None, working_dir='.', **kwargs)[source]

Packages can declare patches to apply to source. You can optionally provide a when spec to indicate that a particular patch should only be applied when the package’s spec meets certain conditions (e.g. a particular version).

Parameters
  • url_or_filename (str) – url or relative filename of the patch

  • level (int) – patch level (as in the patch shell command)

  • when (spack.spec.Spec) – optional anonymous spec that specifies when to apply the patch

  • working_dir (str) – dir to change to before applying

Keyword Arguments
  • sha256 (str) – sha256 sum of the patch, used to verify the patch (only required for URL patches)

  • archive_sha256 (str) – sha256 sum of the archive, if the patch is compressed (only required for compressed URL patches)
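
For illustration, a URL patch with its required checksum might be declared as follows; the URL and checksum placeholder are invented:

# Inside a hypothetical package.py.
patch('https://example.com/fixes/libexample-cxx17.patch',
      sha256='<sha256 of the patch file>',
      when='@1.2.0',
      level=1)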

spack.directives.provides(*specs, **kwargs)[source]

Allows packages to provide a virtual dependency. If a package provides ‘mpi’, other packages can declare that they depend on “mpi”, and spack can use the providing package to satisfy the dependency.

spack.directives.resource(**kwargs)[source]

Define an external resource to be fetched and staged when building the package. The appropriate FetchStrategy for the resource is chosen based on the keywords provided. Resources are fetched and staged in their own folder inside the Spack stage area, and then moved into the stage area of the package that needs them. A hedged usage sketch follows the keyword list below.

List of recognized keywords:

  • ‘when’ : (optional) represents the condition upon which the resource is needed

  • ‘destination’ : (optional) path where to move the resource. This path must be relative to the main package stage area.

  • ‘placement’ : (optional) gives the possibility to fine tune how the resource is moved into the main package stage area.
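
A hedged sketch of a resource declaration (name, URL, checksum placeholder, and variant below are invented) might look like:

# Inside a hypothetical package.py.
resource(name='extra-data',
         url='https://example.com/libexample-data-1.0.tar.gz',
         sha256='<sha256 of the resource archive>',
         when='+data',
         destination='data',
         placement='extra-data')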

spack.directives.variant(name, default=None, description='', values=None, multi=None, validator=None, when=None)[source]

Define a variant for the package. Packager can specify a default value as well as a text description.

Parameters
  • name (str) – name of the variant

  • default (str or bool) – default value for the variant, if not specified otherwise the default will be False for a boolean variant and ‘nothing’ for a multi-valued variant

  • description (str) – description of the purpose of the variant

  • values (tuple or typing.Callable) – either a tuple of strings containing the allowed values, or a callable accepting one value and returning True if it is valid

  • multi (bool) – if False only one value per spec is allowed for this variant

  • validator (typing.Callable) – optional group validator to enforce additional logic. It receives the package name, the variant name and a tuple of values and should raise an instance of SpackError if the group doesn’t meet the additional constraints

  • when (spack.spec.Spec, bool) – optional condition on which the variant applies

Raises

DirectiveError – if arguments passed to the directive are invalid
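
For example, a single-valued variant restricted to a fixed set of values (the variant name and values are invented for illustration) could be declared as:

# Inside a hypothetical package.py.
variant('build_type', default='Release',
        values=('Debug', 'Release', 'RelWithDebInfo'),
        multi=False,
        description='CMake build type to use')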

spack.directives.version(ver, checksum=None, **kwargs)[source]

Adds a version and, if appropriate, metadata for fetching its code.

The version directives are aggregated into a versions dictionary attribute with Version keys and metadata values, where the metadata is stored as a dictionary of kwargs.

The dict of arguments is turned into a valid fetch strategy for code packages later. See spack.fetch_strategy.for_package_version().

Keyword Arguments

deprecated (bool) – whether or not this version is deprecated
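
As an illustration, a recipe might register a current version and mark an older one as deprecated; versions and checksum placeholders are invented:

# Inside a hypothetical package.py.
version('2.0.0', sha256='<sha256 of the 2.0.0 tarball>')
version('1.9.0', sha256='<sha256 of the 1.9.0 tarball>', deprecated=True)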

spack.directory_layout module

class spack.directory_layout.DirectoryLayout(root, **kwargs)[source]

Bases: object

A directory layout is used to associate unique paths with specs. Different installations are going to want different layouts for their install, and they can use this to customize the nesting structure of spack installs. The default layout is:

  • <install root>/

    • <platform-os-target>/

      • <compiler>-<compiler version>/

        • <name>-<version>-<hash>

The hash here is a SHA-1 hash for the full DAG plus the build spec.

The installation directory projections can be modified with the projections argument.
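
A hedged sketch of using such a layout follows; the root path is arbitrary, concrete_spec stands in for an already-concretized spack.spec.Spec, and the projections keyword is assumed here to take a projection template as described above:

import spack.directory_layout as dl

# Layout rooted at an arbitrary path; the projection template controls the
# nesting described above. concrete_spec is a placeholder for a concretized
# spack.spec.Spec object.
layout = dl.DirectoryLayout(
    '/opt/spack/opt',
    projections={'all': '{architecture}/{compiler.name}-{compiler.version}/'
                        '{name}-{version}-{hash}'})

prefix = layout.path_for_spec(concrete_spec)        # absolute path below the root
relpath = layout.relative_path_for_spec(concrete_spec)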

all_deprecated_specs()[source]
all_specs()[source]
build_packages_path(spec)[source]
check_installed(spec)[source]
create_install_directory(spec)[source]
deprecated_file_path(deprecated_spec, deprecator_spec=None)[source]

Gets full path to spec file for deprecated spec

If the deprecator_spec is provided, use that. Otherwise, assume deprecated_spec is already deprecated and its prefix links to the prefix of its deprecator.

disable_upstream_check()[source]
env_metadata_path(spec)[source]
property hidden_file_regexes
metadata_path(spec)[source]
path_for_spec(spec)[source]

Return absolute path from the root to a directory for the spec.

read_spec(path)[source]

Read the contents of a file and parse them as a spec

relative_path_for_spec(spec)[source]
remove_install_directory(spec, deprecated=False)[source]

Removes a prefix and any empty parent directories from the root. Raises RemoveFailedError if something goes wrong.

spec_file_path(spec)[source]

Gets full path to spec file

specs_by_hash()[source]
write_host_environment(spec)[source]

The host environment is a json file with os, kernel, and spack versioning. We use it in the case that an analysis later needs to easily access this information.

write_spec(spec, path)[source]

Write a spec out to a file.

exception spack.directory_layout.DirectoryLayoutError(message, long_msg=None)[source]

Bases: spack.error.SpackError

Superclass for directory layout errors.

exception spack.directory_layout.ExtensionAlreadyInstalledError(spec, ext_spec)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when an extension is added to a package that already has it.

exception spack.directory_layout.ExtensionConflictError(spec, ext_spec, conflict)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when an extension is added to a package that already has it.

class spack.directory_layout.ExtensionsLayout(view, **kwargs)[source]

Bases: object

A directory layout is used to associate unique paths with specs for package extensions. Keeps track of which extensions are activated for what package. Depending on the use case, this can mean globally activated extensions directly in the installation folder - or extensions activated in filesystem views.

add_extension(spec, ext_spec)[source]

Add to the list of currently installed extensions.

check_activated(spec, ext_spec)[source]

Ensure that ext_spec can be removed from spec.

If not, raise NoSuchExtensionError.

check_extension_conflict(spec, ext_spec)[source]

Ensure that ext_spec can be activated in spec.

If not, raise ExtensionAlreadyInstalledError or ExtensionConflictError.

extendee_target_directory(extendee)[source]

Specify to which full path extendee should link all files from extensions.

extension_map(spec)[source]

Get a dict of currently installed extension packages for a spec.

Dict maps { name : extension_spec } Modifying dict does not affect internals of this layout.

remove_extension(spec, ext_spec)[source]

Remove from the list of currently installed extensions.

exception spack.directory_layout.InconsistentInstallDirectoryError(message, long_msg=None)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when a package seems to be installed to the wrong place.

exception spack.directory_layout.InvalidDirectoryLayoutParametersError(message, long_msg=None)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when invalid directory layout parameters are supplied

exception spack.directory_layout.InvalidExtensionSpecError(message, long_msg=None)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when an extension file has a bad spec in it.

exception spack.directory_layout.NoSuchExtensionError(spec, ext_spec)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when an extension isn’t there on deactivate.

exception spack.directory_layout.RemoveFailedError(installed_spec, prefix, error)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when a DirectoryLayout cannot remove an install prefix.

exception spack.directory_layout.SpecHashCollisionError(installed_spec, new_spec)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when there is a hash collision in an install layout.

exception spack.directory_layout.SpecReadError(message, long_msg=None)[source]

Bases: spack.directory_layout.DirectoryLayoutError

Raised when directory layout can’t read a spec.

class spack.directory_layout.YamlViewExtensionsLayout(view, layout)[source]

Bases: spack.directory_layout.ExtensionsLayout

Maintain extensions within a view.

add_extension(spec, ext_spec)[source]

Add to the list of currently installed extensions.

check_activated(spec, ext_spec)[source]

Ensure that ext_spec can be removed from spec.

If not, raise NoSuchExtensionError.

check_extension_conflict(spec, ext_spec)[source]

Ensure that ext_spec can be activated in spec.

If not, raise ExtensionAlreadyInstalledError or ExtensionConflictError.

extension_file_path(spec)[source]

Gets full path to an installed package’s extension file, which keeps track of all the extensions for that package which have been added to this view.

extension_map(spec)[source]

Defensive copying version of _extension_map() for external API.

remove_extension(spec, ext_spec)[source]

Remove from the list of currently installed extensions.

spack.error module

exception spack.error.NoHeadersError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when package headers are requested but cannot be found

exception spack.error.NoLibrariesError(message_or_name, prefix=None)[source]

Bases: spack.error.SpackError

Raised when package libraries are requested but cannot be found

exception spack.error.SpackError(message, long_message=None)[source]

Bases: Exception

This is the superclass for all Spack errors. Subclasses can be found in the modules they have to do with.

die()[source]
property long_message
print_context()[source]

Print extended debug information about this exception.

This is usually printed when the top-level Spack error handler calls die(), but it can be called separately beforehand if a lower-level error handler needs to print error context and continue without raising the exception to the top level.

exception spack.error.SpecError(message, long_message=None)[source]

Bases: spack.error.SpackError

Superclass for all errors that occur while constructing specs.

exception spack.error.UnsatisfiableSpecError(provided, required=None, constraint_type=None, conflicts=None)[source]

Bases: spack.error.SpecError

Raised when a spec conflicts with package constraints.

For original concretizer, provide the requirement that was violated when raising.

exception spack.error.UnsupportedPlatformError(message)[source]

Bases: spack.error.SpackError

Raised by packages when a platform is not supported

spack.error.debug = False

whether we should write stack traces or short error messages this is module-scoped because it needs to be set very early

spack.extensions module

Service functions and classes to implement the hooks for Spack’s command extensions.

exception spack.extensions.CommandNotFoundError(cmd_name)[source]

Bases: spack.error.SpackError

Exception class thrown when a requested command is not recognized as such.

exception spack.extensions.ExtensionNamingError(path)[source]

Bases: spack.error.SpackError

Exception class thrown when a configured extension does not follow the expected naming convention.

spack.extensions.extension_name(path)[source]

Returns the name of the extension in the path passed as argument.

Parameters

path (str) – path where the extension resides

Returns

The extension name.

Raises

ExtensionNamingError – if path does not match the expected format for a Spack command extension.

spack.extensions.get_command_paths()[source]

Return the list of paths where to search for command files.

spack.extensions.get_module(cmd_name)[source]

Imports the extension module for a particular command name and returns it.

Parameters

cmd_name (str) – name of the command for which to get a module (contains -, not _).

spack.extensions.get_template_dirs()[source]

Returns the list of directories where to search for templates in extensions.

spack.extensions.load_command_extension(command, path)[source]

Loads a command extension from the path passed as argument.

Parameters
  • command (str) – name of the command (contains -, not _).

  • path (str) – base path of the command extension

Returns

A valid module if found and loadable; None if not found. Module

loading exceptions are passed through.

spack.extensions.path_for_extension(target_name, *paths)[source]

Return the test root dir for a given extension.

Parameters
  • target_name (str) – name of the extension to test

  • *paths – paths where the extensions reside

Returns

Root directory where tests should reside or None

spack.fetch_strategy module

Fetch strategies are used to download source code into a staging area in order to build it. They need to define the following methods (a standalone sketch of this interface follows the list):

  • fetch()

    This should attempt to download/check out source from somewhere.

  • check()

    Apply a checksum to the downloaded source code, e.g. for an archive. May not do anything if the fetch method was safe to begin with.

  • expand()

    Expand (e.g., an archive) downloaded file to source, with the standard stage source path as the destination directory.

  • reset()

Restore original state of downloaded code. Used by clean commands. This may just remove the expanded source and re-expand an archive, or it may run something like git reset --hard.

  • archive()

    Archive a source directory, e.g. for creating a mirror.
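
The standalone toy class below shows the shape of that interface; it does not subclass Spack's FetchStrategy and is not registered with Spack:

import shutil
import urllib.request

class ToyUrlFetcher(object):
    """Standalone illustration of the five-method interface described above."""

    def __init__(self, url, archive_path, source_path):
        self.url = url
        self.archive_path = archive_path
        self.source_path = source_path

    def fetch(self):
        # Download the source archive from somewhere.
        urllib.request.urlretrieve(self.url, self.archive_path)

    def check(self):
        # Verify a checksum of the downloaded archive (omitted in this toy).
        pass

    def expand(self):
        # Unpack the archive into the source directory.
        shutil.unpack_archive(self.archive_path, self.source_path)

    def reset(self):
        # Restore the freshly downloaded state by re-expanding the archive.
        shutil.rmtree(self.source_path, ignore_errors=True)
        self.expand()

    def archive(self, destination):
        # Copy the original archive, e.g. when populating a mirror.
        shutil.copy(self.archive_path, destination)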

class spack.fetch_strategy.BundleFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.FetchStrategy

Fetch strategy associated with bundle, or no-code, packages.

Having a basic fetch strategy is a requirement for executing post-install hooks. Consequently, this class provides the API but does little more than log messages.

TODO: Remove this class by refactoring resource handling and the link between composite stages and composite fetch strategies (see #11981).

property cachable

Report False as there is no code to cache.

fetch()[source]

Simply report success – there is no code to fetch.

mirror_id()[source]

BundlePackages don’t have a mirror id.

source_id()[source]

BundlePackages don’t have a source id.

url_attr = ''

There is no associated URL keyword in version() for no-code packages but this property is required for some strategy-related functions (e.g., check_pkg_attributes).

class spack.fetch_strategy.CacheURLFetchStrategy(url=None, checksum=None, **kwargs)[source]

Bases: spack.fetch_strategy.URLFetchStrategy

The resource associated with a cache URL may be out of date.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

exception spack.fetch_strategy.ChecksumError(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised when archive fails to checksum.

class spack.fetch_strategy.CvsFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.VCSFetchStrategy

Fetch strategy that gets source code from a CVS repository.

Use like this in a package:

version('name',
        cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename')

Optionally, you can provide a branch and/or a date for the URL:

version('name',
        cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename',
        branch='branchname', date='date')

Repositories are checked out into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

property cvs
fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = ['branch', 'date']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr = 'cvs'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

exception spack.fetch_strategy.ExtrapolationError(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised when we can’t extrapolate a version for a package.

exception spack.fetch_strategy.FailedDownloadError(url, msg='')[source]

Bases: spack.fetch_strategy.FetchError

Raised when a download fails.

exception spack.fetch_strategy.FetchError(message, long_message=None)[source]

Bases: spack.error.SpackError

Superclass for fetcher errors.

class spack.fetch_strategy.FetchStrategy(**kwargs)[source]

Bases: object

Superclass of all fetch strategies.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

check()[source]

Checksum the archive fetched by this FetchStrategy.

expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

classmethod matches(args)[source]

Predicate that matches fetch strategies to arguments of the version directive.

Parameters

args – arguments of the version directive

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = []
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr = None

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.FetchStrategyComposite[source]

Bases: spack.util.pattern.Composite

Composite for a FetchStrategy object.

classmethod matches(args)

Predicate that matches fetch strategies to arguments of the version directive.

Parameters

args – arguments of the version directive

source_id()[source]
exception spack.fetch_strategy.FetcherConflict(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised for packages with invalid fetch attributes.

class spack.fetch_strategy.FsCache(root)[source]

Bases: object

destroy()[source]
fetcher(target_path, digest, **kwargs)[source]
store(fetcher, relative_dest)[source]
class spack.fetch_strategy.GCSFetchStrategy(*args, **kwargs)[source]

Bases: spack.fetch_strategy.URLFetchStrategy

FetchStrategy that pulls from a GCS bucket.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

url_attr = 'gs'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.GitFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.VCSFetchStrategy

Fetch strategy that gets source code from a git repository. Use like this in a package:

version('name', git='https://github.com/project/repo.git')

Optionally, you can provide a branch, tag, or commit to check out, e.g.:

version('1.1', git='https://github.com/project/repo.git', tag='v1.1')

You can use these three optional attributes in addition to git:

  • branch: Particular branch to build from (default is the

    repository’s default branch)

  • tag: Particular tag to check out

  • commit: Particular commit hash in the repo

Repositories are cloned into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

clone(dest=None, commit=None, branch=None, tag=None, bare=False)[source]

Clone a repository to a path.

This method handles cloning from git, but does not require a stage.

Parameters
  • dest (str or None) – The path into which the code is cloned. If None, requires a stage and uses the stage’s source path.

  • commit (str or None) – A commit to fetch from the remote. Only one of commit, branch, and tag may be non-None.

  • branch (str or None) – A branch to fetch from the remote.

  • tag (str or None) – A tag to fetch from the remote.

  • bare (bool) – Execute a "bare" git clone (--bare option to git)

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

property git
property git_version
git_version_re = 'git version (\\S+)'
mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = ['tag', 'branch', 'commit', 'submodules', 'get_full_repo', 'submodules_delete']
protocol_supports_shallow_clone()[source]

Shallow clone operations (--depth #) are not supported by the basic HTTP protocol or by no-protocol file specifications. Use (e.g.) https:// or file:// instead.

reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr = 'git'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

static version_from_git(git_exe)[source]

Given a git executable, return the Version (this will fail if the output cannot be parsed into a valid Version).

class spack.fetch_strategy.GoFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.VCSFetchStrategy

Fetch strategy that employs the go get infrastructure.

Use like this in a package:

version('name',
        go='github.com/monochromegane/the_platinum_searcher/...')

Go get does not natively support versions, they can be faked with git.

The fetched source will be moved to the standard stage source path directory during the expand step.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

property go
property go_version
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

url_attr = 'go'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.HgFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.VCSFetchStrategy

Fetch strategy that gets source code from a Mercurial repository. Use like this in a package:

version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')

Optionally, you can provide a branch, or revision to check out, e.g.:

version('torus',
        hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')

You can use the optional ‘revision’ attribute to check out a branch, tag, or particular revision in hg. To prevent non-reproducible builds, using a moving target like a branch is discouraged.

  • revision: Particular revision, branch, or tag.

Repositories are cloned into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

property hg

Returns: Executable: the hg executable

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = ['revision']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr = 'hg'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

exception spack.fetch_strategy.InvalidArgsError(pkg=None, version=None, **args)[source]

Bases: spack.fetch_strategy.FetchError

Raised when a version can’t be deduced from a set of arguments.

exception spack.fetch_strategy.NoArchiveFileError(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised when an archive file is expected but none exists.

exception spack.fetch_strategy.NoCacheError(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised when there is no cached archive for a package.

exception spack.fetch_strategy.NoDigestError(message, long_message=None)[source]

Bases: spack.fetch_strategy.FetchError

Raised after attempt to checksum when URL has no digest.

exception spack.fetch_strategy.NoStageError(method)[source]

Bases: spack.fetch_strategy.FetchError

Raised when fetch operations are called before set_stage().

class spack.fetch_strategy.S3FetchStrategy(*args, **kwargs)[source]

Bases: spack.fetch_strategy.URLFetchStrategy

FetchStrategy that pulls from an S3 bucket.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

url_attr = 's3'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.SvnFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.VCSFetchStrategy

Fetch strategy that gets source code from a subversion repository.

Use like this in a package:

version('name', svn='http://www.example.com/svn/trunk')

Optionally, you can provide a revision for the URL:

version('name', svn='http://www.example.com/svn/trunk',
        revision='1641')

Repositories are checked out into the standard stage source path directory.

archive(destination)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = ['revision']
reset()[source]

Revert to freshly downloaded state.

For archive files, this may just re-expand the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

property svn
url_attr = 'svn'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.URLFetchStrategy(url=None, checksum=None, **kwargs)[source]

Bases: spack.fetch_strategy.FetchStrategy

URLFetchStrategy pulls source code from a URL for an archive, checks the archive against a checksum, and decompresses the archive.

The destination for the resulting file(s) is the standard stage path.

archive(destination)[source]

Just moves this archive to the destination.

property archive_file

Path to the source archive within this stage directory.

property cachable

Whether fetcher is capable of caching the resource it retrieves.

This generally is determined by whether the resource is identifiably associated with a specific package version.

Returns

True if can cache, False otherwise.

Return type

bool

property candidate_urls
check()[source]

Check the downloaded archive against a checksum digest. No-op if this stage checks code out of a repository.

property curl
expand()[source]

Expand the downloaded archive into the stage source path.

fetch()[source]

Fetch source code archive or repo.

Returns

True on success, False on failure.

Return type

bool

mirror_id()[source]

This is a unique ID for a source that is intended to help identify reuse of resources across packages.

It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.

optional_attrs = ['md5', 'sha1', 'sha224', 'sha256', 'sha384', 'sha512', 'checksum']
reset()[source]

Removes the source path if it exists, then re-expands the archive.

source_id()[source]

A unique ID for the source.

It is intended that a human could easily generate this themselves using the information available to them in the Spack package.

The returned value is added to the content which determines the full hash for a package using str().

url_attr = 'url'

The URL attribute must be specified either at the package class level, or as a keyword argument to version(). It is used to distinguish fetchers for different versions in the package DSL.

class spack.fetch_strategy.VCSFetchStrategy(**kwargs)[source]

Bases: spack.fetch_strategy.FetchStrategy

Superclass for version control system fetch strategies.

Like all fetchers, VCS fetchers are identified by the attributes passed to the version directive. The optional_attrs for a VCS fetch strategy represent types of revisions, e.g. tags, branches, commits, etc.

The required attributes (git, svn, etc.) are used to specify the URL and to distinguish a VCS fetch strategy from a URL fetch strategy.

archive(destination, **kwargs)[source]

Create an archive of the downloaded data for a mirror.

For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.

check()[source]

Checksum the archive fetched by this FetchStrategy.

expand()[source]

Expand the downloaded archive into the stage source path.

spack.fetch_strategy.all_strategies = [<class 'spack.fetch_strategy.BundleFetchStrategy'>, <class 'spack.fetch_strategy.URLFetchStrategy'>, <class 'spack.fetch_strategy.CacheURLFetchStrategy'>, <class 'spack.fetch_strategy.GoFetchStrategy'>, <class 'spack.fetch_strategy.GitFetchStrategy'>, <class 'spack.fetch_strategy.CvsFetchStrategy'>, <class 'spack.fetch_strategy.SvnFetchStrategy'>, <class 'spack.fetch_strategy.HgFetchStrategy'>, <class 'spack.fetch_strategy.S3FetchStrategy'>, <class 'spack.fetch_strategy.GCSFetchStrategy'>]

List of all fetch strategies, created by FetchStrategy metaclass.

spack.fetch_strategy.check_pkg_attributes(pkg)[source]

Find ambiguous top-level fetch attributes in a package.

Currently this only ensures that two or more VCS fetch strategies are not specified at once.

spack.fetch_strategy.fetcher(cls)[source]

Decorator used to register fetch strategies.

spack.fetch_strategy.for_package_version(pkg, version)[source]

Determine a fetch strategy based on the arguments supplied to version() in the package description.

spack.fetch_strategy.from_kwargs(**kwargs)[source]

Construct an appropriate FetchStrategy from the given keyword arguments.

Parameters

**kwargs – dictionary of keyword arguments, e.g. from a version() directive in a package.

Returns

The fetch strategy that matches the args, based

on attribute names (e.g., git, hg, etc.)

Return type

typing.Callable

Raises

FetchError – If no fetch_strategy matches the args.

spack.fetch_strategy.from_list_url(pkg)[source]

If a package provides a URL which lists URLs for resources by version, this can create a fetcher for a URL discovered for the specified package's version.

spack.fetch_strategy.from_url(url)[source]

Given a URL, find an appropriate fetch strategy for it. Currently just gives you a URLFetchStrategy that uses curl.

TODO: make this return appropriate fetch strategies for other

types of URLs.

spack.fetch_strategy.from_url_scheme(url, *args, **kwargs)[source]

Finds a suitable FetchStrategy by matching its url_attr with the scheme in the given url.

spack.fetch_strategy.stable_target(fetcher)[source]

Returns whether the fetcher target is expected to have a stable checksum. This is only true if the target is a preexisting archive file.

spack.fetch_strategy.warn_content_type_mismatch(subject, content_type='HTML')[source]

spack.filesystem_view module

class spack.filesystem_view.FilesystemView(root, layout, **kwargs)[source]

Bases: object

Governs a filesystem view that is located at certain root-directory.

Packages are linked from their install directories into a common file hierarchy.

In distributed filesystems, loading each installed package separately can lead to slow-downs due to too many directories being traversed. This can be circumvented by loading all needed modules into a common directory structure.

add_extension(spec)[source]

Add (link) an extension in this view. Does not add dependencies.

add_specs(*specs, **kwargs)[source]

Add given specs to view.

The supplied specs might be standalone packages or extensions of other packages.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of activate_{extension,standalone}.

add_standalone(spec)[source]

Add (link) a standalone package into this view.

check_added(spec)[source]

Check if the given concrete spec is active in this view.

get_all_specs()[source]

Get all specs currently active in this view.

get_projection_for_spec(spec)[source]

Get the projection in this view for a spec.

get_spec(spec)[source]

Return the actual spec linked in this view (i.e. do not look it up in the database by name).

spec can be a name or a spec from which the name is extracted.

As there can only be a single version active for any spec the name is enough to identify the spec in the view.

If no spec is present, returns None.

print_status(*specs, **kwargs)[source]
Print a short summary about the given specs, detailing whether..
  • ..they are active in the view.

  • ..they are active but the activated version differs.

  • ..they are not active in the view.

Takes with_dependencies keyword argument so that the status of dependencies is printed as well.

remove_extension(spec)[source]

Remove (unlink) an extension from this view.

remove_specs(*specs, **kwargs)[source]

Removes given specs from view.

The supplied spec might be a standalone package or an extension of another package.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.

Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of deactivate_{extension,standalone}.

remove_standalone(spec)[source]

Remove (unlink) a standalone package from this view.

class spack.filesystem_view.YamlFilesystemView(root, layout, **kwargs)[source]

Bases: spack.filesystem_view.FilesystemView

Filesystem view to work with a yaml based directory layout.

add_extension(spec)[source]

Add (link) an extension in this view. Does not add dependencies.

add_specs(*specs, **kwargs)[source]

Add given specs to view.

The supplied specs might be standalone packages or extensions of other packages.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of activate_{extension,standalone}.

add_standalone(spec)[source]

Add (link) a standalone package into this view.

check_added(spec)[source]

Check if the given concrete spec is active in this view.

clean()[source]
get_all_specs()[source]

Get all specs currently active in this view.

get_conflicts(*specs)[source]

Return list of tuples (<spec>, <spec in view>) where the spec active in the view differs from the one to be activated.

get_path_meta_folder(spec)[source]

Get path to meta folder for either spec or spec name.

get_projection_for_spec(spec)[source]

Return the projection for a spec in this view.

Relies on the ordering of projections to avoid ambiguity.

get_spec(spec)[source]

Return the actual spec linked in this view (i.e. do not look it up in the database by name).

spec can be a name or a spec from which the name is extracted.

As there can only be a single version active for any spec the name is enough to identify the spec in the view.

If no spec is present, returns None.

merge(spec, ignore=None)[source]
print_conflict(spec_active, spec_specified, level='error')[source]

Singular print function for spec conflicts.

print_status(*specs, **kwargs)[source]
Print a short summary about the given specs, detailing whether..
  • ..they are active in the view.

  • ..they are active but the activated version differs.

  • ..they are not active in the view.

Takes with_dependencies keyword argument so that the status of dependencies is printed as well.

read_projections()[source]
remove_extension(spec, with_dependents=True)[source]

Remove (unlink) an extension from this view.

remove_files(files)[source]
remove_specs(*specs, **kwargs)[source]

Removes given specs from view.

The supplied spec might be a standalone package or an extension of another package.

Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.

Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.

Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.

This method should make use of deactivate_{extension,standalone}.

remove_standalone(spec)[source]

Remove (unlink) a standalone package from this view.

unmerge(spec, ignore=None)[source]
write_projections()[source]

spack.gcs_handler module

spack.gcs_handler.gcs_open(req, *args, **kwargs)[source]

Open a reader stream to a blob object on GCS

spack.graph module

Functions for graphing DAGs of dependencies.

This file contains code for graphing DAGs of software packages (i.e. Spack specs). There are two main functions you probably care about:

graph_ascii() will output a colored graph of a spec in ascii format, kind of like the graph git shows with "git log --graph", e.g.:

o  mpileaks
|\
| |\
| o |  callpath
|/| |
| |\|
| |\ \
| | |\ \
| | | | o  adept-utils
| |_|_|/|
|/| | | |
o | | | |  mpi
 / / / /
| | o |  dyninst
| |/| |
|/|/| |
| | |/
| o |  libdwarf
|/ /
o |  libelf
 /
o  boost

graph_dot() will output a graph of a spec (or multiple specs) in dot format.

Note that graph_ascii assumes a single spec while graph_dot can take a number of specs as input.
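
A hedged usage sketch (it assumes a working Spack installation and uses 'zlib' purely as an example package name):

import sys

import spack.graph
import spack.spec

# Concretize an example spec, then emit both graph formats described above.
spec = spack.spec.Spec('zlib').concretized()
spack.graph.graph_ascii(spec, out=sys.stdout)    # ASCII graph of a single spec
spack.graph.graph_dot([spec], out=sys.stdout)    # dot output for graphviz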

class spack.graph.AsciiGraph[source]

Bases: object

write(spec, color=None, out=None)[source]

Write out an ascii graph of the provided spec.

Arguments: spec – spec to graph. This only handles one spec at a time.

Optional arguments:

out – file object to write out to (default is sys.stdout)

color – whether to write in color. Default is to autodetect

based on output file.

spack.graph.graph_ascii(spec, node='o', out=None, debug=False, indent=0, color=None, deptype='all')[source]
spack.graph.graph_dot(specs, deptype='all', static=False, out=None)[source]

Generate a graph in dot format of all provided specs.

Print out a dot formatted graph of all the dependencies between package. Output can be passed to graphviz, e.g.:

spack graph --dot qt | dot -Tpdf > spack-graph.pdf
spack.graph.topological_sort(spec, reverse=False, deptype='all')[source]

Topological sort for specs.

Return a list of dependency specs sorted topologically. The spec argument is not modified in the process.

spack.hash_types module

Definitions that control how Spack creates Spec hashes.

class spack.hash_types.SpecHashDescriptor(deptype, package_hash, name, override=None)[source]

Bases: object

This class defines how hashes are generated on Spec objects.

Spec hashes in Spack are generated from a serialized (e.g., with YAML) representation of the Spec graph. The representation may only include certain dependency types, and it may optionally include a canonicalized hash of the package.py for each node in the graph.

We currently use different hashes for different use cases.

property attr

Private attribute stored on spec

spack.hash_types.build_hash = <spack.hash_types.SpecHashDescriptor object>

Hash descriptor that includes build dependencies.

spack.hash_types.dag_hash = <spack.hash_types.SpecHashDescriptor object>

Default Hash descriptor, used by Spec.dag_hash() and stored in the DB.

spack.hash_types.full_hash = <spack.hash_types.SpecHashDescriptor object>

Full hash used in build pipelines to determine when to rebuild packages.

spack.hash_types.package_hash = <spack.hash_types.SpecHashDescriptor object>

Package hash used as part of full hash

spack.hash_types.process_hash = <spack.hash_types.SpecHashDescriptor object>

Hash descriptor used only to transfer a DAG, as is, across processes

spack.install_test module

exception spack.install_test.TestFailure(failures)[source]

Bases: spack.error.SpackError

Raised when package tests have failed for an installation.

class spack.install_test.TestSuite(specs, alias=None)[source]

Bases: object

add_failure(exc, msg)[source]
property content_hash
property current_test_cache_dir
property current_test_data_dir
ensure_stage()[source]
static from_dict(d)[source]
static from_file(filename)[source]
log_file_for_spec(spec)[source]
property name
property results_file
property stage
test_dir_for_spec(spec)[source]
classmethod test_log_name(spec)[source]
classmethod test_pkg_id(spec)[source]

Build the standard install test package identifier

Args: spec (Spec): instance of the spec under test

Returns: (str): the install test package identifier

tested_file_for_spec(spec)[source]
classmethod tested_file_name(spec)[source]
to_dict()[source]
write_reproducibility_data()[source]
write_test_result(spec, result)[source]
exception spack.install_test.TestSuiteFailure(num_failures)[source]

Bases: spack.error.SpackError

Raised when one or more tests in a suite have failed.

exception spack.install_test.TestSuiteNameError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when there is an issue with the naming of the test suite.

exception spack.install_test.TestSuiteSpecError(message, long_message=None)[source]

Bases: spack.error.SpackError

Raised when there is an issue associated with the spec being tested.

spack.install_test.get_all_test_suites()[source]
spack.install_test.get_escaped_text_output(filename)[source]

Retrieve and escape the expected text output from the file

Parameters

filename (str) – path to the file

Returns

escaped text lines read from the file

Return type

list

spack.install_test.get_named_test_suites(name)[source]

Return a list of the names of any test suites with that name.

spack.install_test.get_test_stage_dir()[source]
spack.install_test.get_test_suite(name)[source]
spack.install_test.write_test_suite_file(suite)[source]

Write the test suite to its lock file.

spack.installer module

This module encapsulates package installation functionality.

The PackageInstaller coordinates concurrent builds of packages for the same Spack instance by leveraging the dependency DAG and file system locks. It also proceeds with the installation of non-dependent packages of failed dependencies in order to install as many dependencies of a package as possible.

Bottom-up traversal of the dependency DAG while prioritizing packages with no uninstalled dependencies allows multiple processes to perform concurrent builds of separate packages associated with a spec.

File system locks enable coordination such that no two processes attempt to build the same or a failed dependency package.

Failures to install dependency packages result in removal of their dependents’ build tasks from the current process. A failure file is also written (and locked) so that other processes can detect the failure and adjust their build tasks accordingly.

This module supports the coordination of local and distributed concurrent installations of packages in a Spack instance.
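
A hedged sketch of driving an install programmatically is shown below; normally this happens behind spack install, the package name is arbitrary, error handling is omitted, and the installs argument is assumed here to be a list of (package, install_args) pairs:

import spack.spec
from spack.installer import PackageInstaller

# Concretize a spec and hand its package, plus per-package install arguments,
# to the installer.
spec = spack.spec.Spec('zlib').concretized()
installer = PackageInstaller([(spec.package, {})])
installer.install()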

exception spack.installer.BadInstallPhase(pkg_name, phase)[source]

Bases: spack.installer.InstallError

Raised when an install phase option is not allowed for a package.

class spack.installer.BuildProcessInstaller(pkg, install_args)[source]

Bases: object

This class implements the part of the installation that happens in the child process.

run()[source]

Main entry point from build_process to kick off install in child.

class spack.installer.BuildRequest(pkg, install_args)[source]

Bases: object

Class for representing an installation request.

get_deptypes(pkg)[source]

Determine the required dependency types for the associated package.

Parameters

pkg (spack.package.PackageBase) – explicit or implicit package being installed

Returns

required dependency type(s) for the package

Return type

tuple

has_dependency(dep_id)[source]

Returns True if the package id represents a known dependency of the requested package, False otherwise.

run_tests(pkg)[source]

Determine if the tests should be run for the provided packages

Parameters

pkg (spack.package.PackageBase) – explicit or implicit package being installed

Returns

True if they should be run; False otherwise

Return type

bool

property spec

The specification associated with the package.

traverse_dependencies()[source]

Yield any dependencies of the appropriate type(s)

Yields

(Spec) The next child spec in the DAG

class spack.installer.BuildTask(pkg, request, compiler, start, attempts, status, installed)[source]

Bases: object

Class for representing the build task for a package.

add_dependent(pkg_id)[source]

Ensure the dependent package id is in the task’s list so it will be properly updated when this package is installed.

Parameters

pkg_id (str) – package identifier of the dependent package

property explicit

The package was explicitly requested by the user.

flag_installed(installed)[source]

Ensure the dependency is not considered to still be uninstalled.

Parameters

installed (list) – the identifiers of packages that have been installed so far

property key

The key is the tuple (# uninstalled dependencies, sequence).

next_attempt(installed)[source]

Create a new, updated task for the next installation attempt.

property priority

The priority is based on the remaining uninstalled dependencies.

exception spack.installer.ExternalPackageError(message, long_msg=None)[source]

Bases: spack.installer.InstallError

Raised by install() when a package is only for external use.

class spack.installer.InstallAction[source]

Bases: object

INSTALL = 1

Do a standard install

NONE = 0

Don’t perform an install

OVERWRITE = 2

Do an overwrite install

exception spack.installer.InstallError(message, long_msg=None)[source]

Bases: spack.error.SpackError

Raised when something goes wrong during install or uninstall.

exception spack.installer.InstallLockError(message, long_msg=None)[source]

Bases: spack.installer.InstallError

Raised during install when something goes wrong with package locking.

class spack.installer.OverwriteInstall(installer, database, task, tmp_root=None)[source]

Bases: object

install()[source]

Try to run the install task overwriting the package prefix. If this fails, try to recover the original install prefix. If that fails too, mark the spec as uninstalled. This function always re-raises the original install error if installation fails.

class spack.installer.PackageInstaller(installs=[])[source]

Bases: object

Class for managing the install process for a Spack instance based on a bottom-up DAG approach.

This installer can coordinate concurrent batch and interactive, local and distributed (on a shared file system) builds for the same Spack instance.

install()[source]

Install the requested package(s) and/or associated dependencies.

Parameters

pkg (spack.package.Package) – the package to be built and installed
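
A hedged sketch of driving the installer directly from Python; the (package, install_args) pairing and the use of an empty argument dict are assumptions based on the constructor signature:

import spack.spec
from spack.installer import PackageInstaller

# Hypothetical: install a single concretized spec with default arguments.
spec = spack.spec.Spec('zlib').concretized()
installer = PackageInstaller([(spec.package, {})])
installer.install()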

spack.installer.STATUS_ADDED = 'queued'

Build status indicating task has been added.

spack.installer.STATUS_DEQUEUED = 'dequeued'

Build status indicating the task has been popped from the queue

spack.installer.STATUS_FAILED = 'failed'

Build status indicating the spec failed to install

spack.installer.STATUS_INSTALLED = 'installed'

Build status indicating the spec was successfully installed

spack.installer.STATUS_INSTALLING = 'installing'

Build status indicating the spec is being installed (possibly by another process)

spack.installer.STATUS_REMOVED = 'removed'

Build status indicating task has been removed (to maintain priority queue invariants).

class spack.installer.TermTitle(pkg_count)[source]

Bases: object

next_pkg()[source]
set(text)[source]
exception spack.installer.UpstreamPackageError(message, long_msg=None)[source]

Bases: spack.installer.InstallError

Raised during install when something goes wrong with an upstream package.

spack.installer.build_process(pkg, install_args)[source]

Perform the installation/build of the package.

This runs in a separate child process, and has its own process and python module space set up by build_environment.start_build_process().

This essentially wraps an instance of BuildProcessInstaller so that we can more easily create one in a subprocess.

This function’s return value is returned to the parent process.

Parameters
  • pkg (spack.package.PackageBase) – the package to be built and installed

  • install_args (dict) – arguments to the installer, forwarded from the parent process

spack.installer.clear_failures()[source]

Remove all failure tracking markers for the Spack instance.

spack.installer.combine_phase_logs(phase_log_files, log_path)[source]

Read set or list of logs and combine them into one file.

Each phase produces its own log, so this function concatenates all the separate phase log output files into pkg.log_path. It is written to accept any list of files and a destination log path.

Parameters
  • phase_log_files (list) – a list or iterator of logs to combine

  • log_path (str) – the path to combine them to
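
Conceptually the combination is a straight concatenation; the sketch below illustrates that behavior and is not necessarily the exact implementation:

def combine_phase_logs(phase_log_files, log_path):
    # Append the contents of each per-phase log to the combined log file.
    with open(log_path, 'wb') as log_file:
        for phase_log_file in phase_log_files:
            with open(phase_log_file, 'rb') as phase_log:
                log_file.write(phase_log.read())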

spack.installer.dump_packages(spec, path)[source]

Dump all package information for a spec and its dependencies.

This creates a package repository within path for every namespace in the spec DAG, and fills the repos with package files and patch files for every node in the DAG.

Parameters
  • spec (spack.spec.Spec) – the Spack spec whose package information is to be dumped

  • path (str) – the path to the build packages directory

spack.installer.get_dependent_ids(spec)[source]

Return a list of package ids for the spec’s dependents

Parameters

spec (spack.spec.Spec) – Concretized spec

Returns

list of package ids

Return type

list

spack.installer.install_msg(name, pid)[source]

Colorize the name/id of the package being installed

Parameters
  • name (str) – Name/id of the package being installed

  • pid (int) – id of the installer process

Returns

Colorized installing message

Return type

str

spack.installer.log(pkg)[source]

Copy provenance into the install directory on success

Parameters

pkg (spack.package.Package) – the package that was built and installed

spack.installer.package_id(pkg)[source]

A “unique” package identifier for installation purposes

The identifier is used to track build tasks, locks, install, and failure statuses.

The identifier needs to distinguish between combinations of compilers and packages for combinatorial environments.

Parameters

pkg (spack.package.PackageBase) – the package from which the identifier is derived
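
A hedged usage sketch; the spec name is a placeholder, and the exact format of the returned identifier is not specified here:

import spack.spec
from spack.installer import package_id

# package_id() expects a package whose spec is already concrete.
spec = spack.spec.Spec('zlib').concretized()
print(package_id(spec.package))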

spack.main module

This is the implementation of the Spack command line executable.

In a normal Spack installation, this is invoked from the bin/spack script after the system path is set up.

class spack.main.SpackArgumentParser(prog=None, usage=None, description=None, epilog=None, parents=[], formatter_class=<class 'argparse.HelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='error', add_help=True, allow_abbrev=True)[source]

Bases: argparse.ArgumentParser

add_command(cmd_name)[source]

Add one subcommand to this parser.

add_subparsers(**kwargs)[source]

Ensure that sensible defaults are propagated to subparsers

format_help(level='short')[source]
format_help_sections(level)[source]

Format help on sections for a particular verbosity level.

Parameters

level (str) – ‘short’ or ‘long’ (more commands shown for long)

class spack.main.SpackCommand(command_name)[source]

Bases: object

Callable object that invokes a spack command (for testing).

Example usage:

install = SpackCommand('install')
install('-v', 'mpich')

Use this to invoke Spack commands directly from Python and check their output.
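
A hedged extension of the example above, showing how the captured output might be checked in a test; the package name and flags are illustrative:

from spack.main import SpackCommand

install = SpackCommand('install')
find = SpackCommand('find')

# The callable returns the command's output as a string, so tests can
# make assertions on it directly.
install('--fake', 'zlib')
output = find()
assert 'zlib' in output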

exception spack.main.SpackCommandError[source]

Bases: Exception

Raised when SpackCommand execution fails.

class spack.main.SpackHelpFormatter(prog, indent_increment=2, max_help_position=24, width=None)[source]

Bases: argparse.RawTextHelpFormatter

add_arguments(actions)[source]
spack.main.add_all_commands(parser)[source]

Add all spack subcommands to the parser.

spack.main.aliases = {'rm': 'remove'}

top-level aliases for Spack commands

spack.main.allows_unknown_args(command)[source]

Implements really simple argument injection for unknown arguments.

Commands may add an optional argument called “unknown args” to indicate they can handle unknown args, and we’ll pass the unknown args in.

spack.main.get_version()[source]

Get a descriptive version of this instance of Spack.

If this is a git repository, and if it is not on a release tag, return a string like:

release_version-commits_since_release-commit

If we are at a release tag, or if this is not a git repo, return the real spack release number (e.g., 0.13.3).

spack.main.index_commands()[source]

create an index of commands by section for this help level

spack.main.intro_by_level = {'long': 'Complete list of spack commands:', 'short': 'These are common spack commands:'}

intro text for help at different levels

spack.main.levels = ['short', 'long']

help levels in order of detail (i.e., number of commands shown)

spack.main.main(argv=None)[source]

This is the entry point for the Spack command.

main() itself is just an error handler – it handles errors for everything in Spack that makes it to the top level.

The logic is all in _main().

Parameters

argv (list or None) – command line arguments, NOT including the executable name. If None, parses from sys.argv.

spack.main.make_argument_parser(**kwargs)[source]

Create a basic argument parser without any subcommands added.

spack.main.options_by_level = {'long': 'all', 'short': ['h', 'k', 'V', 'color']}

control top-level spack options shown in basic vs. advanced help

spack.main.print_setup_info(*info)[source]

Print basic information needed by setup-env.[c]sh.

Parameters

info (list) – list of things to print: comma-separated list of ‘csh’, ‘sh’, or ‘modules’

This is in main.py to make it fast; the setup scripts need to invoke spack in login scripts, and it needs to be quick.

spack.main.required_command_properties = ['level', 'section', 'description']

Properties that commands are required to set.

spack.main.section_descriptions = {'admin': 'administration', 'basic': 'query packages', 'build': 'build packages', 'config': 'configuration', 'developer': 'developer', 'environment': 'environment', 'extensions': 'extensions', 'help': 'more help', 'packaging': 'create packages', 'system': 'system'}

Longer text for each section, to show in help

spack.main.section_order = {'basic': ['list', 'info', 'find'], 'build': ['fetch', 'stage', 'patch', 'configure', 'build', 'restage', 'install', 'uninstall', 'clean'], 'packaging': ['create', 'edit']}

preferential command order for some sections (e.g., build pipeline is in execution order, not alphabetical)

spack.main.send_warning_to_tty(message, *args)[source]

Redirects messages to tty.warn.

spack.main.set_working_dir()[source]

Change the working directory to getcwd, or spack prefix if no cwd.

spack.main.setup_main_options(args)[source]

Configure spack globals based on the basic options.

spack.main.spack_working_dir = None

Recorded directory where spack command was originally invoked

spack.main.stat_names = {'calls': (((1, -1),), 'call count'), 'cumtime': (((3, -1),), 'cumulative time'), 'cumulative': (((3, -1),), 'cumulative time'), 'filename': (((4, 1),), 'file name'), 'line': (((5, 1),), 'line number'), 'module': (((4, 1),), 'file name'), 'name': (((6, 1),), 'function name'), 'ncalls': (((1, -1),), 'call count'), 'nfl': (((6, 1), (4, 1), (5, 1)), 'name/file/line'), 'pcalls': (((0, -1),), 'primitive call count'), 'stdname': (((7, 1),), 'standard name'), 'time': (((2, -1),), 'internal time'), 'tottime': (((2, -1),), 'internal time')}

names of profile statistics

spack.mirror module

This file contains code for creating spack mirror directories. A mirror is an organized hierarchy containing specially named archive files. This enables spack to know where to find files in a mirror if the main server for a particular package is down. Or, if the computer where spack is run is not connected to the internet, it allows spack to download packages directly from a mirror (e.g., on an intranet).

class spack.mirror.Mirror(fetch_url, push_url=None, name=None)[source]

Bases: object

Represents a named location for storing source tarballs and binary packages.

Mirrors have a fetch_url that indicates where and how artifacts are fetched from them, and a push_url that indicates where and how artifacts are pushed to them. These two URLs are usually the same.

display(max_len=0)[source]
property fetch_url
static from_dict(d, name=None)[source]
static from_json(stream, name=None)[source]
static from_yaml(stream, name=None)[source]
get_access_pair(url_type)[source]
get_access_token(url_type)[source]
get_endpoint_url(url_type)[source]
get_profile(url_type)[source]
property name
property push_url
set_access_pair(url_type, connection_tuple)[source]
set_access_token(url_type, connection_token)[source]
set_endpoint_url(url_type, url)[source]
set_profile(url_type, profile)[source]
to_dict()[source]
to_json(stream=None)[source]
to_yaml(stream=None)[source]
class spack.mirror.MirrorCollection(mirrors=None, scope=None)[source]

Bases: collections.abc.Mapping

A mapping of mirror names to mirrors.

display()[source]
static from_dict(d)[source]
static from_json(stream, name=None)[source]
static from_yaml(stream, name=None)[source]
lookup(name_or_url)[source]

Looks up and returns a Mirror.

If this MirrorCollection contains a named Mirror under the name [name_or_url], then that mirror is returned. Otherwise, [name_or_url] is assumed to be a mirror URL, and an anonymous mirror with the given URL is returned.

to_dict(recursive=False)[source]
to_json(stream=None)[source]
to_yaml(stream=None)[source]
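
A hedged sketch of looking up a mirror, either by a configured name or by URL; the URL below is a placeholder:

import spack.mirror

# Read mirrors from the active configuration scopes.
mirrors = spack.mirror.MirrorCollection()

# Returns the named mirror if one is configured; otherwise the argument is
# treated as a URL and an anonymous Mirror for it is returned.
mirror = mirrors.lookup('https://mirror.example.com/spack')
print(mirror.fetch_url, mirror.push_url)
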
exception spack.mirror.MirrorError(msg, long_msg=None)[source]

Bases: spack.error.SpackError

Superclass of all mirror-creation related errors.

class spack.mirror.MirrorReference(cosmetic_path, global_path=None)[source]

Bases: object

A MirrorReference stores the relative paths where you can store a package/resource in a mirror directory.

The appropriate storage location is given by storage_path. The cosmetic_path property provides a reference that a human could generate themselves based on reading the details of the package.

A user can iterate over a MirrorReference object to get all the possible names that might be used to refer to the resource in a mirror; this includes names generated by previous naming schemes that are no longer reported by storage_path or cosmetic_path.

property storage_path
class spack.mirror.MirrorStats[source]

Bases: object

added(resource)[source]
already_existed(resource)[source]
error()[source]
next_spec(spec)[source]
stats()[source]
spack.mirror.add(name, url, scope, args={})[source]

Add a named mirror in the given scope

spack.mirror.create(path, specs, skip_unstable_versions=False)[source]

Create a directory to be used as a spack mirror, and fill it with package archives.

Parameters
  • path – Path to create a mirror directory hierarchy in.

  • specs – Any package versions matching these specs will be added to the mirror.

  • skip_unstable_versions – if true, this skips adding resources when they do not have a stable archive checksum (as determined by fetch_strategy.stable_target)

Return Value:

Returns a tuple of lists: (present, mirrored, error)

  • present: Package specs that were already present.

  • mirrored: Package specs that were successfully mirrored.

  • error: Package specs that failed to mirror due to some error.

This routine iterates through all known package versions, and it creates specs for those versions. If the version satisfies any spec in the specs list, it is downloaded and added to the mirror.
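
A hedged usage sketch that unpacks the documented return value; the path and spec below are placeholders:

import spack.mirror
import spack.spec

specs = [spack.spec.Spec('zlib')]
present, mirrored, errors = spack.mirror.create('/tmp/my-mirror', specs)
print(len(present), len(mirrored), len(errors))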

spack.mirror.get_all_versions(specs)[source]

Given a set of initial specs, return a new set of specs that includes each version of each package in the original set.

Note that if any spec in the original set specifies properties other than version, this information will be omitted from the new set; for example, the new set of specs will not include variant settings.

spack.mirror.get_matching_versions(specs, num_versions=1)[source]

Get a spec for EACH known version matching any spec in the list. For concrete specs, this retrieves the concrete version and, if more than one version per spec is requested, retrieves the latest versions of the package.

spack.mirror.mirror_archive_paths(fetcher, per_package_ref, spec=None)[source]

Returns a MirrorReference object which keeps track of the relative storage path of the resource associated with the specified fetcher.

spack.mirror.remove(name, scope)[source]

Remove the named mirror in the given scope

spack.mixins module

This module contains additional behavior that can be attached to any given package.

class spack.mixins.PackageMixinsMeta(name, bases, attr_dict)[source]

Bases: type

This metaclass serves the purpose of implementing a declarative syntax for package mixins.

Mixins are implemented below in the form of a function. Each one of them needs to register a callable that takes a single argument to be run before or after a certain phase. This callable is basically a method that gets implicitly attached to the package class by calling the mixin.

static register_method_after(fn, phase)[source]

Registers a method to be run after a certain phase.

Parameters
  • fn – function taking a single argument (self)

  • phase (str) – phase after which fn must run

static register_method_before(fn, phase)[source]

Registers a method to be run before a certain phase.

Parameters
  • fn – function taking a single argument (self)

  • phase (str) – phase before which fn must run

spack.mixins.filter_compiler_wrappers(*files, **kwargs)[source]

Substitutes any path referring to a Spack compiler wrapper with the path of the underlying compiler that has been used.

If this isn’t done, the files will have CC, CXX, F77, and FC set to Spack’s generic cc, c++, f77, and f90. We want them to be bound to whatever compiler they were built with.

Parameters
  • *files – files to be filtered relative to the search root (which is, by default, the installation prefix)

  • **kwargs

    allowed keyword arguments

    after

    specifies after which phase the files should be filtered (defaults to ‘install’)

    relative_root

    path relative to prefix where to start searching for the files to be filtered. If not set, the install prefix will be used as the search root. It is highly recommended to set this, as searching from the installation prefix may severely affect performance in some cases.

    ignore_absent, backup

    these two keyword arguments, if present, will be forwarded to filter_file (see its documentation for more information on their behavior)

    recursive

    this keyword argument, if present, will be forwarded to find (see its documentation for more information on the behavior)
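
A hedged sketch of how a package might use this mixin; the package and wrapper file names are illustrative (real MPI packages use a very similar call):

from spack import *


class Mympi(AutotoolsPackage):
    """Hypothetical MPI library whose installed wrapper scripts record the
    compilers used at build time."""

    homepage = "https://example.com/mympi"
    url = "https://example.com/mympi-1.0.tar.gz"

    # After the 'install' phase, rewrite references to Spack's compiler
    # wrappers in bin/mpicc and bin/mpicxx to point at the real compilers.
    filter_compiler_wrappers('mpicc', 'mpicxx', relative_root='bin')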

spack.monitor module

Interact with a Spack Monitor Service. Derived from https://github.com/spack/spack-monitor/blob/main/script/spackmoncli.py

class spack.monitor.SpackMonitorClient(host=None, prefix='ms1', allow_fail=False, tags=None, save_local=False)[source]

Bases: object

Client to interact with a spack monitor server.

We require the host url, along with the prefix, to discover the service_info endpoint. If allow_fail is set to True, we will not exit with tty.die when a request is unsuccessful. The spack version is one of the fields used to uniquely identify a spec, so we add it to the client on init.

authenticate_request(originalResponse)[source]

Authenticate the request.

Given a response (an HTTPError 401), look for a Www-Authenticate header to parse. We return True/False to indicate if the request should be retried.

capture_build_environment()[source]

Capture the environment for the build.

This uses spack.util.environment.get_host_environment_metadata to do so. This is important because it’s a unique identifier, along with the spec, for a Build. It should look something like this:

{'host_os': 'ubuntu20.04',
 'platform': 'linux',
 'host_target': 'skylake',
 'hostname': 'vanessa-ThinkPad-T490s',
 'spack_version': '0.16.1-1455-52d5b55b65',
 'kernel_version': '#73-Ubuntu SMP Mon Jan 18 17:25:17 UTC 2021'}

This is saved to a package install’s metadata folder as install_environment.json, and can be loaded by the monitor for uploading data relevant to a later analysis.

do_request(endpoint, data=None, headers=None, url=None)[source]

Do the actual request.

If data is provided, the request is a POST; otherwise it is a GET. If an entire URL is provided, the endpoint is not used.

fail_task(spec)[source]

Given a spec, mark it as failed. This means that Spack Monitor marks all dependencies as cancelled, unless they are already successful

get_build_id(spec, return_response=False, spec_exists=True)[source]

Retrieve a build id, either in the local cache, or query the server.

get_local_build_id(data, full_hash, return_response)[source]

Generate a local build id based on hashing the expected data

get_server_build_id(data, full_hash, return_response=False)[source]

Retrieve a build id from the spack monitor server

issue_request(request, retry=True)[source]

Given a prepared request, issue it.

If we get an error, die. If there are times when we don’t want to exit on error (but instead disable using the monitoring service) we could add that here.

iter_read(pattern)[source]

A helper to read json from a directory glob and return it loaded.

load_build_environment(spec)[source]

Load a build environment from install_environment.json.

If we are running an analyze command, we will need to load previously used build environment metadata from install_environment.json to capture what was done during the build.

new_build(spec)[source]

Create a new build.

This means sending the hash of the spec to be built, along with the build environment. These two sets of data can uniquely identify the build, and we will add objects (the binaries produced) to it. We return the build id to the calling client.

new_configuration(specs)[source]

Given a list of specs, generate a new configuration for each.

We return a lookup of specs with their package names. This assumes that we are only installing one version of each package. We aren’t starting or creating any builds, so we don’t need a build environment.

prepare_request(endpoint, data, headers)[source]

Prepare a request given an endpoint, data, and headers.

If data is provided, urllib makes the request a POST

require_auth()[source]

Require authentication.

Both the token and username must be set.

reset()[source]

Reset and prepare for a new request.

save(obj, filename)[source]

Save a monitor json result to the save directory.

send_analyze_metadata(pkg, metadata)[source]

Send spack analyzer metadata to the spack monitor server.

Given a dictionary of analyzers (with key as analyzer type, and value as the data) upload the analyzer output to Spack Monitor. Spack Monitor should either have a known understanding of the analyzer, or if not (the key is not recognized), it’s assumed to be a dictionary of objects/files, each with attributes to be updated. E.g.,

{“analyzer-name”: {“object-file-path”: {“feature1”: “value1”}}}

send_phase(pkg, phase_name, phase_output_file, status)[source]

Send the result of a phase during install.

Given a package, phase name, and status, update the monitor endpoint to alert of the status of the stage. This includes parsing the package metadata folder for phase output and error files

service_info()[source]

Get the service information endpoint

set_basic_auth(username, password)[source]

A wrapper to adding basic authentication to the Request

set_header(name, value)[source]
setup_save()[source]

Given a local save (“save_local”), ensure the output directory exists.

update_build(spec, status='SUCCESS')[source]

Update a build with a new status.

This typically updates the relevant package to indicate a successful install. This endpoint can take a general status to update.

upload_local_save(dirname)[source]

Upload results from a locally saved directory to spack monitor.

The general workflow will first include an install with save local:

spack install --monitor --monitor-save-local

and then a request to upload the root or a specific directory:

spack upload monitor ~/.spack/reports/monitor/<date>/

upload_specfile(filename)[source]

Upload a spec file to the spack monitor server.

Given a spec file (must be json) upload to the UploadSpec endpoint. This function is not used in the spack to server workflow, but could be useful if Spack Monitor is intended to send an already generated file in some kind of separate analysis. For the environment file, we parse out SPACK_* variables to include.

class spack.monitor.authHeader(lookup)[source]

Bases: object

spack.monitor.get_client(host, prefix='ms1', disable_auth=False, allow_fail=False, tags=None, save_local=False)[source]

Get a monitor client for a particular host and prefix.

If the client is not running, we exit early, unless allow_fail is set to true, indicating that we should continue the build even if the server is not present. Note that this client is defined globally as “cli” so we can instantiate it once (checking for credentials, etc.) and then always have access to it via spack.monitor.cli. Also note that typically, we call the monitor by way of hooks in spack.hooks.monitor. So if you want the monitor to have a new interaction with some part of the codebase, it’s recommended to write a hook first, and then have the monitor use it.
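
A hedged sketch of obtaining and using a client; the host URL is a placeholder and authentication setup is omitted:

import spack.monitor

# Hypothetical host; with allow_fail=True a missing server does not abort,
# and disable_auth=True skips credential checks.
cli = spack.monitor.get_client(host='http://127.0.0.1:8080',
                               disable_auth=True, allow_fail=True)
info = cli.service_info()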

spack.monitor.get_monitor_group(subparser)[source]

Retrieve the monitor group for the argument parser.

Since the monitor group is shared between commands, we provide a common function to generate the group for it. The user passes the subparser, and the group is added to it and returned.

spack.monitor.parse_auth_header(authHeaderRaw)[source]

Parse an authentication header into relevant pieces

spack.monitor.read_file(filename)[source]

Read a file, if it exists. Otherwise return None

spack.monitor.read_json(filename)[source]

Read a file and load into json, if it exists. Otherwise return None.

spack.monitor.write_file(content, filename)[source]

Write content to file

spack.monitor.write_json(obj, filename)[source]

Write a json file, if the output directory exists.

spack.multimethod module

This module contains utilities for using multi-methods in spack. You can think of multi-methods like overloaded methods – they’re methods with the same name, and we need to select a version of the method based on some criteria. e.g., for overloaded methods, you would select a version of the method to call based on the types of its arguments.

In spack, multi-methods are used to ease the life of package authors. They allow methods like install() (or other methods called by install()) to declare multiple versions to be called when the package is instantiated with different specs. e.g., if the package is built with OpenMPI on x86_64, you might want to call a different install method than if it was built for mpich2 on BlueGene/Q. Likewise, you might want to do a different type of install for different versions of the package.

Multi-methods provide a simple decorator-based syntax for this that avoids overly complicated rat nests of if statements. Obviously, depending on the scenario, regular old conditionals might be clearer, so package authors should use their judgement.

exception spack.multimethod.MultiMethodError(message)[source]

Bases: spack.error.SpackError

Superclass for multimethod dispatch errors

class spack.multimethod.MultiMethodMeta(name, bases, attr_dict)[source]

Bases: type

This allows us to track the class’s dict during instantiation.

exception spack.multimethod.NoSuchMethodError(cls, method_name, spec, possible_specs)[source]

Bases: spack.error.SpackError

Raised when we can’t find a version of a multi-method.

class spack.multimethod.SpecMultiMethod(default=None)[source]

Bases: object

This implements a multi-method for Spack specs. Packages are instantiated with a particular spec, and you may want to execute different versions of methods based on what the spec looks like. For example, you might want to call a different version of install() for one platform than you call on another.

The SpecMultiMethod class implements a callable object that handles method dispatch. When it is called, it looks through registered methods and their associated specs, and it tries to find one that matches the package’s spec. If it finds one (and only one), it will call that method.

This is intended for use with decorators (see below). The decorator (see docs below) creates SpecMultiMethods and registers method versions with them.

To register a method, you can do something like this:

mm = SpecMultiMethod()
mm.register("^chaos_5_x86_64_ib", some_method)

The object registered needs to be a Spec or some string that will parse to be a valid spec.

When the mm is actually called, it selects a version of the method to call based on the sys_type of the object it is called on.

See the docs for decorators below for more details.

register(spec, method)[source]

Register a version of a method for a particular spec.

class spack.multimethod.when(condition)[source]

Bases: object
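
The usual entry point to this machinery is the when decorator in package definitions; a minimal, hypothetical sketch:

from spack import *


class Example(Package):
    """Hypothetical package with version-dependent install logic."""

    # Default implementation, used when no registered condition matches.
    def install(self, spec, prefix):
        make()
        make('install')

    # Called instead of the default when the concretized spec satisfies @:1.0.
    @when('@:1.0')
    def install(self, spec, prefix):
        make('legacy-install', 'PREFIX={0}'.format(prefix))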

spack.package module

This is where most of the action happens in Spack.

The spack package class structure is based strongly on Homebrew (http://brew.sh/), mainly because Homebrew makes it very easy to create packages.

exception spack.package.ActivationError(msg, long_msg=None)[source]

Bases: spack.package.ExtensionError

Raised when there are problems activating an extension.

class spack.package.BundlePackage(spec)[source]

Bases: spack.package.PackageBase

General purpose bundle, or no-code, package class.

build_system_class = 'BundlePackage'

This attribute is used in UI queries that need to know which build-system class we are using

has_code = False

Bundle packages do not have associated source or binary code.

phases = []

There are no phases by default but the property is required to support post-install hooks (e.g., for module generation).

exception spack.package.DependencyConflictError(conflict)[source]

Bases: spack.error.SpackError

Raised when the dependencies cannot be flattened as asked for.

class spack.package.DetectablePackageMeta(name, bases, attr_dict)[source]

Bases: object

Check if a package is detectable and add default implementations for the detection function.

exception spack.package.ExtensionError(message, long_msg=None)[source]

Bases: spack.package.PackageError

Superclass for all errors having to do with extension packages.

exception spack.package.FetchError(message, long_msg=None)[source]

Bases: spack.error.SpackError

Raised when something goes wrong during fetch.

class spack.package.InstallPhase(name)[source]

Bases: object

Manages a single phase of the installation.

This descriptor stores at creation time the name of the method it should search for execution. The method is retrieved at __get__ time, so that it can be overridden by subclasses of whatever class declared the phases.

It also provides hooks to execute arbitrary callbacks before and after the phase.

copy()[source]
exception spack.package.InvalidPackageOpError(message, long_msg=None)[source]

Bases: spack.package.PackageError

Raised when someone tries to perform an invalid operation on a package.

exception spack.package.NoURLError(cls)[source]

Bases: spack.package.PackageError

Raised when someone tries to build a URL for a package with no URLs.

class spack.package.Package(spec)[source]

Bases: spack.package.PackageBase

General purpose class with a single install phase that needs to be coded by packagers.

build_system_class = 'Package'

This attribute is used in UI queries that need to know which build-system class we are using

phases = ['install']

The one and only phase
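
A minimal, hypothetical sketch of such a package, with the single install() phase implemented by the packager (directives such as version() are omitted for brevity):

from spack import *


class Hello(Package):
    """Hypothetical example package."""

    homepage = "https://example.com/hello"
    url = "https://example.com/hello-1.0.tar.gz"

    def install(self, spec, prefix):
        # A classic configure/make/make install sequence; real packages
        # would adapt this to their own build system.
        configure('--prefix={0}'.format(prefix))
        make()
        make('install')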

class spack.package.PackageBase(spec)[source]

Bases: spack.package.PackageViewMixin, object

This is the superclass for all spack packages.

*The Package class*

At its core, a package consists of a set of software to be installed. A package may focus on a piece of software and its associated software dependencies or it may simply be a set, or bundle, of software. The former requires defining how to fetch, verify (via, e.g., sha256), build, and install that software and the packages it depends on, so that dependencies can be installed along with the package itself. The latter, sometimes referred to as a no-source package, requires only defining the packages to be built.

Packages are written in pure Python.

There are two main parts of a Spack package:

  1. The package class. Classes contain directives, which are special functions, that add metadata (versions, patches, dependencies, and other information) to packages (see directives.py). Directives provide the constraints that are used as input to the concretizer.

  2. Package instances. Once instantiated, a package is essentially a software installer. Spack calls methods like do_install() on the Package object, and it uses those to drive user-implemented methods like patch(), install(), and other build steps. To install software, an instantiated package needs a concrete spec, which guides the behavior of the various install methods.

Packages are imported from repos (see repo.py).

Package DSL

Look in lib/spack/docs or check https://spack.readthedocs.io for the full documentation of the package domain-specific language. That used to be partially documented here, but as it grew, the docs here became increasingly out of date.

Package Lifecycle

A package’s lifecycle over a run of Spack looks something like this:

p = Package()             # Done for you by spack

p.do_fetch()              # downloads tarball from a URL (or VCS)
p.do_stage()              # expands tarball in a temp directory
p.do_patch()              # applies patches to expanded source
p.do_install()            # calls package's install() function
p.do_uninstall()          # removes install directory

although packages that do not have code have nothing to fetch and so omit p.do_fetch().

There are also some other commands that clean the build area:

p.do_clean()              # removes the stage directory entirely
p.do_restage()            # removes the build directory and
                          # re-expands the archive.

The convention used here is that a do_* function is intended to be called internally by Spack commands (in spack.cmd). These aren’t for package writers to override, and doing so may break the functionality of the Package class.

Package creators have a lot of freedom, and they could technically override anything in this class. That is not usually required.

For most use cases, package creators typically just add attributes like homepage and, for a code-based package, url, or functions such as install(). There are many custom Package subclasses in the spack.build_systems package that make things even easier for specific build systems.

activate(extension, view, **kwargs)[source]

Add the extension to the specified view.

Package authors can override this function to maintain some centralized state related to the set of activated extensions for a package.

Spack internals (commands, hooks, etc.) should call do_activate() method so that proper checks are always executed.

classmethod all_patches()[source]

Retrieve all patches associated with the package.

Retrieves patches on the package itself as well as patches on the dependencies of the package.

property all_urls

A list of all URLs in a package.

Check both class-level and version-specific URLs.

Returns

a list of URLs

Return type

list

apply_macos_rpath_fixups()[source]

On Darwin, make installed libraries more easily relocatable.

Some build systems (handrolled, autotools, makefiles) can set their own rpaths that are duplicated by spack’s compiler wrapper. This fixup interrogates, and postprocesses if necessary, all libraries installed by the code.

It should be added as a @run_after to packaging systems (or individual packages) that do not install relocatable libraries by default.

archive_files = []

List of glob expressions. Each expression must either be absolute or relative to the package source path. Matching artifacts found at the end of the build process will be copied in the same directory tree as _spack_build_logfile and _spack_build_envfile.

property build_log_path

Return the expected (or current) build log file path. The path points to the staging build file until the software is successfully installed, when it points to the file in the installation directory.

classmethod build_system_flags(name, flags)[source]

flag_handler that passes flags to the build system arguments. Any package using build_system_flags must also implement flags_to_build_system_args, or derive from a class that implements it. Currently, AutotoolsPackage and CMakePackage implement it.
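
A hedged sketch of opting into this handler from a CMake-based package:

from spack import *


class Example(CMakePackage):
    """Hypothetical package that passes compiler flags through CMake
    arguments instead of Spack's compiler wrappers."""

    # build_system_flags requires the build system class to implement
    # flags_to_build_system_args; CMakePackage does.
    flag_handler = CMakePackage.build_system_flags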

build_time_test_callbacks = None

A list or set of build time test functions to be called when tests are executed or ‘None’ if there are no such test functions.

cache_extra_test_sources(srcs)[source]

Copy relative source paths to the corresponding install test subdir

This method is intended as an optional install test setup helper for grabbing source files/directories during the installation process and copying them to the installation test subdirectory for subsequent use during install testing.

Parameters

srcs (str or list) – relative path for files and/or subdirectories located in the staged source path that are to be copied to the corresponding location(s) under the install testing directory.
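
A hedged sketch of the usual pattern, copying sources right after installation so they are available for later stand-alone tests; the directory name is illustrative:

from spack import *


class Example(MakefilePackage):
    """Hypothetical package that keeps its examples/ sources for testing."""

    @run_after('install')
    def cache_test_sources(self):
        # Copy the examples directory from the staged source tree into the
        # package's install test subdirectory for later use by spack test.
        self.cache_extra_test_sources(['examples'])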

property compiler

Get the spack.compiler.Compiler object used to build this package

property configure_args_path

Return the configure args file path associated with staging.

content_hash(content=None)[source]

Create a hash based on the sources and logic used to build the package. This includes the contents of all applied patches and the contents of applicable functions in the package subclass.

deactivate(extension, view, **kwargs)[source]

Remove all extension files from the specified view.

Package authors can override this method to support other extension mechanisms. Spack internals (commands, hooks, etc.) should call do_deactivate() method so that proper checks are always executed.

dependencies_of_type(*deptypes)[source]

Get dependencies that can possibly have these deptypes.

This analyzes the package and determines which dependencies can be a certain kind of dependency. Note that they may not always be this kind of dependency, since dependencies can be optional, so something may be a build dependency in one configuration and a run dependency in another.

dependency_activations()[source]
do_activate(view=None, with_dependencies=True, verbose=True)[source]

Call