Contribution Guide

This guide is intended for developers or administrators who want to contribute a new package, feature, or bugfix to Spack. It assumes that you have at least some familiarity with Git VCS and Github. The guide will show a few examples of contributing workflows and discuss the granularity of pull-requests (PRs). It will also discuss the tests your PR must pass in order to be accepted into Spack.

First, what is a PR? Quoting Bitbucket's tutorials:

Pull requests are a mechanism for a developer to notify team members that they have completed a feature. The pull request is more than just a notification—it’s a dedicated forum for discussing the proposed feature.

The operative phrase is completed feature. The changes proposed in a PR should correspond to one feature, bugfix, extension, etc. You can create a PR whose changes are relevant to several different ideas, but reviewing such PRs becomes tedious and error-prone. If possible, try to follow the one-PR-one-package/feature rule.

Spack uses a rough approximation of the Git Flow branching model. The develop branch contains the latest contributions, and master is always tagged and points to the latest stable release. Therefore, when you send your request, make develop the destination branch on the Spack repository.

Continuous Integration

Spack uses Travis CI for Continuous Integration testing. This means that every time you submit a pull request, a series of tests will be run to make sure you didn't accidentally introduce any bugs into Spack. Your PR will not be accepted until it passes all of these tests. While you can certainly wait for the results of these tests after submitting a PR, we recommend that you run them locally to speed up the review process.

Note

Oftentimes, Travis will fail for reasons other than a problem with your PR. For example, apt-get, pip, or homebrew may fail to download one of the dependencies for the test suite, or a transient bug may cause the unit tests to time out. If Travis fails, click the "Details" link and click on the test(s) that are failing. If they don't look like they are failing for reasons related to your PR, you have two options. If you have write permissions for the Spack repository, you should see a "Restart job" button on the right-hand side. If not, you can close and reopen your PR to rerun all of the tests. If the same test keeps failing, there may be a problem with your PR. If you notice that every recent PR is failing with the same error message, it may be that Travis is down or that one of Spack's dependencies put out a new release that is causing problems. If this is the case, please file an issue.

If you take a look in $SPACK_ROOT/.travis.yml, you'll notice that we test against Python 2.6, 2.7, and 3.4-3.7 on both macOS and Linux. We currently perform 3 types of tests:

Unit Tests

Unit tests ensure that core Spack features like fetching or spec resolution are working as expected. If your PR only adds new packages or modifies existing ones, there's very little chance that your changes could cause the unit tests to fail. However, if you make changes to Spack's core libraries, you should run the unit tests to make sure you didn't break anything.

Since they test things like fetching from VCS repos, the unit tests require git, mercurial, and subversion to run. Make sure these are installed on your system and can be found in your PATH. All of these can be installed with Spack or with your system package manager.
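As a quick sanity check before running the suite, you can verify that all three VCS tools are discoverable in your PATH. This is a sketch, assuming a POSIX shell; the messages are made up:

```shell
# Check that the VCS tools the unit tests rely on are in PATH.
for tool in git hg svn; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool found at $(command -v "$tool")"
  else
    echo "$tool missing: install it with Spack or your package manager"
  fi
done
```

Any tool reported missing will cause the corresponding fetch tests to fail, so install it first.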

To run all of the unit tests, use:

$ spack test

These tests may take several minutes to complete. If you know you are only modifying a single Spack feature, you can run a single unit test at a time:

$ spack test architecture

This allows you to develop iteratively: make a change, test that change, make another change, test that change, etc. To get a list of all available unit tests, run:

$ spack test --list
architecture               mirror          spec_dag          build_env     mirror             log
build_environment          module_parsing  spec_semantics    buildcache    module             common
build_system_guess         multimethod     spec_set          cd            print_shell_vars   dotkit
build_systems              namespace_trie  spec_syntax       clean         providers          lmod
cc                         optional_deps   spec_yaml         commands      python             tcl
cmd_extensions             package_hash    stage             config        release_jobs       editor
compilers                  package_sanity  svn_fetch         debug         resource           environment
concretize                 packages        tengine           dependencies  spec               executable
concretize_preferences     packaging       test_activations  dependents    test_compiler_cmd  file_cache
config                     patch           url_fetch         env           uninstall          log_parser
database                   pattern         url_parse         find          url                prefix
directory_layout           permissions     url_substitution  flake8        versions           spack_lock_wrapper
environment_modifications  provider_index  variant           gpg           view               spack_yaml
flag_handlers              python_version  versions          graph         arguments          util_string
git_fetch                  relocate        views             help          file_list
graph                      repo            web               info          filesystem
hg_fetch                   sbang           activate          install       lang
install                    schema          arch              license       link_tree
make_executable            spack_yaml      blame             list          lock

A more detailed list of available unit tests can be found by running spack test --long-list.

By default, pytest captures the output of all unit tests. If you add print statements to a unit test and want to see the output, simply run:

$ spack test -s -k architecture

Unit tests are crucial to making sure bugs aren't introduced into Spack. If you are modifying core Spack libraries or adding new functionality, please consider adding new unit tests or strengthening existing tests.

Note

There is also a run-unit-tests script in share/spack/qa that runs the unit tests. Afterwards, it reports back to Codecov with the percentage of Spack that is covered by unit tests. This script is designed for Travis CI. If you want to run the unit tests yourself, we suggest you use spack test.

Flake8 Tests

Spack uses Flake8 to test for PEP 8 conformance. PEP 8 is a style guide for Python that provides suggestions on everything from variable naming to indentation. In order to limit the number of PRs that are mostly style changes, we decided to enforce PEP 8 conformance. Your PR needs to comply with PEP 8 in order to be accepted.

Testing for PEP 8 compliance is easy. Simply run the spack flake8 command:

$ spack flake8

spack flake8 has a couple of advantages over running flake8 by hand:

  1. It only tests files that you have modified since branching off of develop.

  2. It works regardless of what directory you are in.

  3. It automatically adds approved exemptions from the flake8 checks. For example, URLs are often longer than 80 characters, so we exempt them from line length checks. We also exempt lines that start with "homepage", "url", "version", "variant", "depends_on", and "extends" in package.py files.

More approved flake8 exemptions can be found in Spack's .flake8 configuration file.

If all is well, you'll see something like this:

$ spack flake8
=======================================================
flake8: running flake8 code checks on spack.

Modified files:

  var/spack/repos/builtin/packages/hdf5/package.py
  var/spack/repos/builtin/packages/hdf/package.py
  var/spack/repos/builtin/packages/netcdf/package.py
=======================================================
Flake8 checks were clean.

However, if you aren't compliant with PEP 8, flake8 will complain:

var/spack/repos/builtin/packages/netcdf/package.py:26: [F401] 'os' imported but unused
var/spack/repos/builtin/packages/netcdf/package.py:61: [E303] too many blank lines (2)
var/spack/repos/builtin/packages/netcdf/package.py:106: [E501] line too long (92 > 79 characters)
Flake8 found errors.

Most of the error messages are straightforward, but if you don't understand what they mean, just ask questions about them when you submit your PR. The line numbers will change if you add or delete lines, so simply run spack flake8 again to update them.

Tip

Try fixing flake8 errors in reverse order. This eliminates the need for multiple runs of spack flake8 just to re-compute line numbers and makes it much easier to fix errors directly off of the Travis output.
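The reverse ordering itself is easy to automate. This is a sketch using a saved, hypothetical flake8 report; sort reorders the messages so the largest line numbers come first and fixes near the bottom of the file never shift the remaining line numbers:

```shell
# Sort hypothetical flake8 output by line number (field 2), highest first.
sort -t: -k2,2nr <<'EOF'
package.py:26: [F401] 'os' imported but unused
package.py:106: [E501] line too long (92 > 79 characters)
package.py:61: [E303] too many blank lines (2)
EOF
```

With real output, pipe the report through the same sort invocation before working down the list.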

Warning

Flake8 and pep8-naming require a number of dependencies in order to run. If you installed py-flake8 and py-pep8-naming with Spack, the easiest way to ensure the right packages are on your PYTHONPATH is to run:

$ spack activate py-flake8
$ spack activate py-pep8-naming

so that all of the dependencies are symlinked to a central location. If you see an error message like:

Traceback (most recent call last):
  File "/usr/bin/flake8", line 5, in <module>
    from pkg_resources import load_entry_point
ImportError: No module named pkg_resources

that means Flake8 couldn't find setuptools in your PYTHONPATH.

Documentation Tests

Spack uses Sphinx to build its documentation. In order to prevent things like broken links and missing imports, we added documentation tests that build the documentation and fail if there are any warning or error messages.

Building the documentation requires several dependencies, all of which can be installed with Spack:

  • sphinx

  • sphinxcontrib-programoutput

  • sphinx-rtd-theme

  • graphviz

  • git

  • mercurial

  • subversion

Warning

Sphinx has several required dependencies. If you installed py-sphinx with Spack, make sure to add all of these dependencies to your PYTHONPATH. The easiest way to do this is to run:

$ spack activate py-sphinx
$ spack activate py-sphinx-rtd-theme
$ spack activate py-sphinxcontrib-programoutput

so that all of the dependencies are symlinked to a central location. If you see an error message like:

Extension error:
Could not import extension sphinxcontrib.programoutput (exception: No module named sphinxcontrib.programoutput)
make: *** [html] Error 1

that means Sphinx couldn't find py-sphinxcontrib-programoutput in your PYTHONPATH.

Once all of the dependencies are installed, you can try building the documentation:

$ cd "$SPACK_ROOT/lib/spack/docs"
$ make clean
$ make

If you see any warning or error messages, you will have to correct those before your PR is accepted.

Note

There is also a run-doc-tests script in share/spack/qa. The only difference between running this script and running make by hand is that the script will exit immediately if it encounters an error or warning. This is necessary for Travis CI. If you made a lot of documentation changes, it is much quicker to run make by hand so that you can see all of the warnings at once.

If you are editing the documentation, you should obviously be running the documentation tests. But even if you are simply adding a new package, your changes could cause the documentation tests to fail:

package_list.rst:8745: WARNING: Block quote ends without a blank line; unexpected unindent.

At first, this error message will mean nothing to you, since you didn't edit that file. It makes more sense once you look at line 8745 of the file in question:

Description:
   NetCDF is a set of software libraries and self-describing, machine-
  independent data formats that support the creation, access, and sharing
  of array-oriented scientific data.

Our documentation includes a list of all Spack packages. If you add a new package, its docstring is added to this page. The problem in this case was that the docstring looked like:

class Netcdf(Package):
    """
    NetCDF is a set of software libraries and self-describing,
    machine-independent data formats that support the creation,
    access, and sharing of array-oriented scientific data.
    """

Docstrings cannot start with a newline character, or else Sphinx will complain. Instead, they should look like:

class Netcdf(Package):
    """NetCDF is a set of software libraries and self-describing,
    machine-independent data formats that support the creation,
    access, and sharing of array-oriented scientific data."""

Documentation changes can result in much more obfuscated warning messages. If you don't understand what they mean, feel free to ask when you submit your PR.

Coverage

Spack uses Codecov to generate and report unit test coverage. This helps us tell what percentage of lines of code in Spack are covered by unit tests. Although code covered by unit tests can still contain bugs, it is much less error prone than code that is not covered by unit tests.

Codecov provides browser extensions for Google Chrome, Firefox, and Opera. These extensions integrate with GitHub and allow you to see coverage line-by-line when viewing the Spack repository. If you are new to Spack, a great way to get started is to write unit tests to increase coverage!

Unlike with Travis, Codecov tests are not required to pass in order for your PR to be merged. If you modify core Spack libraries, we would greatly appreciate unit tests that cover these changed lines. Otherwise, we have no way of knowing whether or not your changes introduce a bug. If you make substantial changes to the core, we may request unit tests to increase coverage.

Note

If the only files you modified are package files, we do not care about coverage on your PR. You may notice that the Codecov tests fail even though you didn't modify any core files. This means that Spack's overall coverage has increased since you branched off of develop. This is a good thing! If you really want to get the Codecov tests to pass, you can rebase off of the latest develop, but again, this is not required.

Git Workflows

Spack is still in the beta stages of development. Most of our users run off of the develop branch, and fixes and new features are constantly being merged. So how do you keep up-to-date with upstream while maintaining your own local differences and contributing PRs to Spack?

Branching

The easiest way to contribute a pull request is to make all of your changes on new branches. Make sure your develop is up-to-date and create a new branch off of it:

$ git checkout develop
$ git pull upstream develop
$ git branch <descriptive_branch_name>
$ git checkout <descriptive_branch_name>

Here we assume that your local develop branch tracks the upstream develop branch of Spack. This is not a requirement; you could also do the same with remote branches. But many find it more convenient to have a local branch that tracks upstream.

Normally we prefer that commits pertaining to a package <package-name> have a message of the form <package-name>: descriptive message. It is important to write a descriptive message so that others who look at your changes later (in a year, or maybe two) will understand the rationale behind them.

Now, you can make your changes while keeping the develop branch pure. Edit a few files and commit them by running:

$ git add <files_to_be_part_of_the_commit>
$ git commit --message <descriptive_message_of_this_particular_commit>

Next, push it to your remote fork and create a PR:

$ git push origin <descriptive_branch_name> --set-upstream

GitHub provides a tutorial on how to file a pull request. When you send the request, make develop the destination branch.

If you need this change immediately and don't have time to wait for your PR to be merged, you can always work on this branch. But if you have multiple PRs, another option is to maintain a Frankenstein branch that combines all of your other branches:

$ git checkout develop
$ git branch <your_modified_develop_branch>
$ git checkout <your_modified_develop_branch>
$ git merge <descriptive_branch_name>

This can be done with each new PR you submit. Just make sure to keep this local branch up-to-date with upstream develop too.

Cherry-Picking

What if you made some changes to your local modified develop branch and already committed them, but later decided to contribute them to Spack? You can use cherry-picking to create a new branch with only these commits.

First, check out your local modified develop branch:

$ git checkout <your_modified_develop_branch>

Now, get the hashes of the commits you want from the output of:

$ git log

Next, create a new branch off of upstream develop and copy the commits that you want in your PR:

$ git checkout develop
$ git pull upstream develop
$ git branch <descriptive_branch_name>
$ git checkout <descriptive_branch_name>
$ git cherry-pick <hash>
$ git push origin <descriptive_branch_name> --set-upstream

Now you can create a PR from the GitHub web interface. The net result is as follows:

  1. You patched your local version of Spack and can use it further.

  2. You "cherry-picked" these changes in a stand-alone branch and submitted it as a PR upstream.

Should you have several commits to contribute, you can follow the same procedure: collect the hashes of all of them and cherry-pick each one onto the PR branch.
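The whole cherry-pick procedure can be rehearsed safely in a throwaway repository. This is a self-contained sketch; the branch names, file names, and commit messages are made up:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "you@example.com"
git config user.name "You"
git checkout -q -b develop                 # stand-in for upstream develop
echo base > file.txt
git add file.txt && git commit -qm "initial commit"

git checkout -q -b my_modified_develop     # your local modified branch
echo fix > fix.txt
git add fix.txt && git commit -qm "netcdf: fix build flags"
hash=$(git rev-parse HEAD)                 # the commit you want to contribute

git checkout -q develop
git checkout -q -b contribute_fix          # clean stand-alone PR branch
git cherry-pick "$hash"                    # copy just that commit over
test -f fix.txt && echo "commit cherry-picked onto clean branch"
```

The same pattern works against the real repository: only the remotes and the final push differ.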

Note

It is important that whenever you change something that might be of importance upstream, you create a pull request as soon as possible. Do not wait for weeks or months to do this, because:

  1. you might forget why you modified certain files

  2. it could get difficult to isolate this change into a stand-alone clean PR.

Rebasing

Other developers are constantly making contributions to Spack, possibly on the same files that your PR changed. If their PR is merged before yours, it can create a merge conflict. This means that your PR can no longer be automatically merged without a chance of breaking your changes. In this case, you will be asked to rebase on top of the latest upstream develop.

First, make sure your develop branch is up-to-date:

$ git checkout develop
$ git pull upstream develop

Now, we need to switch to the branch you submitted for your PR and rebase it on top of develop:

$ git checkout <descriptive_branch_name>
$ git rebase develop

Git will likely ask you to resolve conflicts. Edit the file that it says can't be merged automatically and resolve the conflict. Then, run:

$ git add <file_that_could_not_be_merged>
$ git rebase --continue

You may have to repeat this process multiple times until all conflicts are resolved. Once this is done, simply force push your rebased branch to your remote fork:

$ git push --force origin <descriptive_branch_name>
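The resolve-and-continue loop can also be rehearsed in a throwaway repository. This is a self-contained sketch with made-up branch names; GIT_EDITOR=true simply skips the commit-message editor that git may open when continuing:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "you@example.com"
git config user.name "You"
git checkout -q -b develop
echo original > file.txt
git add file.txt && git commit -qm "initial commit"

git checkout -q -b my_pr_branch
echo "my change" > file.txt
git commit -qam "my_pr_branch: update file"

git checkout -q develop                    # meanwhile, upstream moves on...
echo "their change" > file.txt
git commit -qam "develop: conflicting update"

git checkout -q my_pr_branch
git rebase develop || true                 # stops at the conflict
echo "my change" > file.txt                # resolve: keep our version
git add file.txt
GIT_EDITOR=true git rebase --continue
git log --oneline                          # our commit now sits atop develop
```

In a real PR you would finish with the force push shown above.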

Rebasing with cherry-pick

You can also perform a rebase using cherry-pick. First, create a temporary backup branch:

$ git checkout <descriptive_branch_name>
$ git branch tmp

If anything goes wrong, you can always go back to your tmp branch. Now, look at the logs and save the hashes of any commits you would like to keep:

$ git log

Next, go back to the original branch and reset it to develop. Before doing so, make sure that your local develop branch is up-to-date with upstream:

$ git checkout develop
$ git pull upstream develop
$ git checkout <descriptive_branch_name>
$ git reset --hard develop

Now you can cherry-pick relevant commits:

$ git cherry-pick <hash1>
$ git cherry-pick <hash2>

Push the modified branch to your fork:

$ git push --force origin <descriptive_branch_name>

If everything looks good, delete the backup branch:

$ git branch --delete --force tmp

Re-writing History

Sometimes you may end up on a branch that has diverged so much from develop that it cannot easily be rebased. If the current commit history is experimental in nature and only the net result is important, you may rewrite the history.

First, merge upstream develop and reset your branch to it. On the branch in question, run:

$ git merge develop
$ git reset develop

At this point your branch will point to the same commit as develop, and the two are thereby indistinguishable. However, all of the files that were previously modified will stay that way; in other words, you do not lose the changes you made. The changes can be reviewed by looking at diffs:

$ git status
$ git diff
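The effect of merge-then-reset is easy to see in a throwaway repository. This is a self-contained sketch with made-up names: after the reset, the branch history matches develop, but the working-tree changes survive as uncommitted modifications:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "you@example.com"
git config user.name "You"
git checkout -q -b develop
echo one > file.txt
git add file.txt && git commit -qm "initial commit"

git checkout -q -b experimental
echo two >> file.txt
git commit -qam "experimental: messy work"
echo three >> file.txt
git commit -qam "experimental: more messy work"

git checkout -q develop                    # upstream progresses independently
echo upstream > upstream.txt
git add upstream.txt && git commit -qm "develop: upstream progress"

git checkout -q experimental
git merge -q --no-edit develop             # bring the branch up to date
git reset -q develop                       # drop the commits, keep the changes
git status --short                         # file.txt shows as modified
```

From here, git add and git commit rebuild a clean history, exactly as described below.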

The next step is to rewrite the history by adding files and creating commits:

$ git add <files_to_be_part_of_commit>
$ git commit --message <descriptive_message>

After all changed files are committed, you can push the branch to your fork and create a PR:

$ git push origin <descriptive_branch_name> --set-upstream