Help developing#
This page describes some of the tools and conventions followed by Common Partial Wave Analysis. Where possible, we use the source code of the AmpForm repository as example, because its file structure is comparable to that of other ComPWA repositories.
Tip
To start developing, simply run the following from a cloned repository on your machine:
Conda:

conda env create
conda activate ampform
pre-commit install --install-hooks

uv:

uv venv -p 3.10
source .venv/bin/activate
uv pip install -c .constraints/py3.10.txt -e '.[dev]'
pre-commit install --install-hooks

venv:

python3 -m venv ./venv
source ./venv/bin/activate
python3 -m pip install -c .constraints/py3.10.txt -e '.[dev]'
pre-commit install --install-hooks

Replace 3.10 with the Python version you use on your machine. See Virtual environment for more info.
Local set-up#
Virtual environment#
When developing source code, it is safest to work within a virtual environment, so that all package dependencies and developer tools are safely contained. This is helpful in case something goes wrong with the dependencies: just trash the environment and recreate it. In addition, you can easily install other versions of the dependencies, without affecting other packages you may be working on.
Some common tools to manage virtual environments are Conda, uv, and Python's built-in venv. In each case, you have to activate the environment whenever you want to run the framework or use the developer tools.
Conda can be installed without administrator rights. It is recommended to download Miniconda, as it is much smaller than Anaconda. In addition, Conda can install more than just Python packages.
For repositories that provide a Conda environment file (environment.yml), the virtual environment can simply be created with
conda env create
Conda now creates an environment with a name that is defined in the environment.yml file. In addition, it will install the framework itself in 'editable' mode, so that you can start developing right away.
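For reference, such an environment.yml file usually looks something like the following. This is a hypothetical sketch; the actual file in each repository defines its own environment name and dependencies:

```yaml
# Hypothetical environment.yml sketch, not the actual ComPWA file.
name: ampform
channels:
  - defaults
dependencies:
  - python==3.10.*
  - pip
  - pip:
      - -e .[dev]  # install the repository itself in editable mode
```

The `pip:` sub-list is what makes Conda install the repository itself in editable mode when the environment is created.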
If you have Python's venv available on your system, you can create a virtual environment with it. Navigate to some convenient folder and run:
python3 -m venv ./venv
This creates a folder called venv
where all Python packages will be contained.
To activate the environment, run:
source ./venv/bin/activate
Now you can safely install the package you want to work on (see 'editable' mode), as well as any additional required packages (see optional dependencies):
pip install -e .
An upcoming and super fast Python package installer is uv. It can be installed without administrator rights and comes with its own venv command to manage virtual environments. Here's how to create an environment for a specific Python version:
uv venv -p 3.10
This creates a folder called .venv
where all Python packages will be contained. To activate the environment, run:
source .venv/bin/activate
You can safely install the package you want to work on (see 'editable' mode), as well as any additional required packages (see optional dependencies):
uv pip install -e .
Tip
Install direnv if a repository contains a .envrc file. This ensures that the appropriate environment is activated once you navigate into the repository!
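As an illustration, a minimal .envrc for direnv could look like this. This is a hypothetical sketch; the actual .envrc in a ComPWA repository may differ:

```shell
# Hypothetical .envrc sketch: direnv runs this automatically when you
# cd into the repository, activating its virtual environment.
source .venv/bin/activate
```

After creating or editing an .envrc file, run `direnv allow` once to authorize it.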
Editable installation#
When developing a package, it is most convenient if you install it in 'editable' mode. This allows you to tweak the source code and try out new ideas immediately, because the source code is considered the 'installation'.
With pip install, a package can be installed in 'editable' mode with the -e flag. Simply clone the repository you want to work on, navigate into it, and run:
python3 -m pip install -e .
Internally, this calls:
python3 setup.py develop
This will also install all dependencies required by the package.
Optional dependencies#
Some packages suggest optional dependencies. They can be installed with pip's 'extras' syntax. Some examples would be:
pip install "tensorwaves[jax,scipy]"
pip install ".[test]" # local directory, not editable
pip install -e ".[dev]" # editable + all dev requirements
Developers require several additional tools besides the dependencies required to run the package itself (see Automated coding conventions). All those additional requirements can be installed with the last example.
Pinning dependency versions#
To ensure that developers use exactly the same versions of the package dependencies and developer requirements, some of the repositories provide constraint files. These files can be used to âpinâ all versions of installed packages as follows:
python3 -m pip install -c .constraints/py3.10.txt -e .
The syntax works just as well for Optional dependencies:
python3 -m pip install -c .constraints/py3.10.txt -e ".[doc,sty]"
python3 -m pip install -c .constraints/py3.10.txt -e ".[test]"
python3 -m pip install -c .constraints/py3.10.txt -e ".[dev]"
The constraint files are updated automatically with
pip-tools
through
GitHub Actions. See
requirements-pr.yml
and
requirements-cron.yml
.
Note
Constraint files ensure that the framework is deterministic and reproducible (up to testing) for all commits and versions, which is vital for both users (doing analysis) and for developers (for instance with continuous integration). In other words, it provides a way out of 'dependency hell'.
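For illustration, a constraint file is simply a flat list of pinned versions as produced by pip-tools. The package names and versions below are a hypothetical excerpt, not the actual contents:

```text
# Hypothetical excerpt of .constraints/py3.10.txt (generated by pip-tools)
attrs==23.1.0
numpy==1.24.3
sympy==1.12
```

Passing this file with `-c` tells pip that any package it installs must match these exact versions.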
Updating#
It may be that new commits in the repository modify the dependencies. In that case, you have to rerun this command after pulling new commits from the repository:
git checkout main
git pull
pip install -c .constraints/py3.10.txt -e ".[dev]"
If you still have problems, it may be that certain dependencies have become redundant. In that case, trash the virtual environment and create a new one.
Julia#
Julia is an upcoming programming language in High-Energy Physics. While ComPWA is mainly developed in Python, we try to keep up with new trends and are experimenting with Julia as well.
Julia can be downloaded here or can be installed within your virtual environment with juliaup. To install Julia system-wide on Linux and Mac, you'll have to unpack the downloaded tar file to a location that is easily accessible. Here's an example, where we also make the Julia executable available to the system:
Via juliaup:

Install juliaup for installing and managing Julia versions.

conda install juliaup -c conda-forge

Optional: select Julia version. By default, this provides you with the latest Julia release. Optionally, you can switch versions as follows:

juliaup add 1.9
juliaup default 1.9

You can switch back to the latest version with:

juliaup default release

Via tarball (user install):

cd ~/Downloads
tar xzf julia-1.10.3-linux-x86_64.tar.gz
mkdir ~/opt ~/bin
mv julia-1.10.3 ~/opt/
ln -s ~/opt/julia-1.10.3/bin/julia ~/bin/julia

Make sure that ~/bin is listed in the PATH environment variable, e.g. by updating it through your .bashrc file:

export PATH="~/bin:$PATH"

Via tarball (system-wide install):

cd ~/Downloads
tar xzf julia-1.10.3-linux-x86_64.tar.gz
sudo mv julia-1.10.3 /opt/
sudo ln -s /opt/julia-1.10.3/bin/julia /usr/local/bin/julia
Just as in Python, it's safest to work with a virtual environment. You can read more about Julia environments here. An environment is defined through a Project.toml file (which defines direct dependencies) and a Manifest.toml file (which exactly pins the installed versions of all recursive dependencies). Don't touch these files; they are automatically managed by the package manager. It does make sense though to commit both Project.toml and Manifest.toml files, so that the environment is reproducible for each commit (see also Pinning dependency versions).
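As an illustration, a minimal Project.toml lists direct dependencies by name and UUID, while the Manifest.toml that Pkg generates from it pins the full recursive dependency tree. The project name and UUID below are hypothetical, for illustration only:

```toml
# Hypothetical Project.toml sketch; [deps] maps package names to UUIDs.
name = "MyAnalysis"

[deps]
# Illustrative entry; Pkg fills in the real UUID when you run Pkg.add.
Parameters = "d96e819e-fc66-5662-9728-84c9c7592b0a"

[compat]
julia = "1.9"
```

Running `Pkg.instantiate()` in a folder with these files reproduces the exact environment recorded in the Manifest.toml.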
See also
Have a look here if you want to integrate Jupyter notebooks with Julia kernels into your documentation.
Automated coding conventions#
Where possible, we define and enforce our coding conventions through automated tools, instead of describing them in documentation. These tools perform their checks when you commit files locally (see Pre-commit), when running tox, and when you make a pull request.
The tools are mainly configured through pyproject.toml
and the workflow files under .github
. These configuration files are kept up to date through the ComPWA/policy repository, which essentially defines the developer environment across all ComPWA repositories.
If you run into persistent linting errors, this may mean we need to further specify our conventions. In that case, it's best to create an issue or a pull request at ComPWA/policy and propose a policy change that can be formulated through those config files.
Pre-commit#
All style checks are enforced through a tool called pre-commit. It's best to activate this tool locally as well. This has to be done only once, after you clone the repository:
pre-commit install --install-hooks
Upon committing, pre-commit runs a set of checks as defined in the file
.pre-commit-config.yaml
over all staged files. You can also quickly run all checks over all indexed files in
the repository with the command:
pre-commit run -a
Whenever you submit a pull request, this command is automatically run on GitHub Actions and on pre-commit.ci, ensuring that all files in the repository follow the same conventions as set in the config files of these tools.
Tox#
More thorough checks can be run in one go with the following command:
tox -p
This command will run pytest, perform all style checks, build the documentation, and verify cross-references in the documentation and the API. It's especially recommended to run tox before submitting a pull request!
More specialized tox jobs are defined in the pyproject.toml config file, under each testenv section under legacy_tox_ini. You can list all environments, along with a description of what they do, by running:
tox list
GitHub Actions#
All style checks, testing of the
documentation and links, and
unit tests are performed upon each pull request through
GitHub Actions (see status overview
here). The checks are defined under the
.github
folder. All checks
performed for each PR have to pass before the PR can be merged.
Style checks#
Formatting#
Formatters are tools that automatically format source code, or some document. Naturally, this speeds up your own programming, but these tools are particularly important when collaborating, because a standardized format avoids line conflicts in Git and makes diffs in code review easier to read.
For the Python source code, we use black and isort (through Ruff). For other code, we use Prettier. All of these formatters are 'opinionated formatters': they offer only limited configuration options, so that formatting is as uniform as possible.
Pre-commit performs some additional formatting jobs. For instance, it
formats Jupyter notebooks with nbQA and strips them
of any output cells with nbstripout
.
Linting#
Linters point out when certain style conventions are not correctly followed. Unlike with formatters, you have to fix the errors yourself. As mentioned in Automated coding conventions, style conventions are formulated in config files. The main linter that ComPWA projects use is Ruff.
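To give an idea of what such a config looks like, Ruff is typically configured through a section in pyproject.toml. The values below are an illustrative sketch, not the actual settings managed by ComPWA/policy:

```toml
# Illustrative Ruff configuration sketch in pyproject.toml;
# the actual rule selection is maintained through ComPWA/policy.
[tool.ruff]
line-length = 88

[tool.ruff.lint]
select = ["E", "F", "I"]  # pycodestyle, pyflakes, and isort rule sets
```

Running `ruff check .` then reports any violations of the selected rules, and `ruff format .` applies the formatter.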
Spelling#
Throughout this repository, we follow American English (en-us) spelling conventions. As a tool, we use cSpell, because it allows checking variable names in camel case and snake case. This way, a spelling checker helps you avoid mistakes in the code as well! cSpell is enforced through pre-commit.
Accepted words are tracked through the .cspell.json file. As with the other config files, .cspell.json formulates our conventions with regard to spelling and can be continuously updated while our code base develops. In the file, the words section lists words that you want to see as suggested corrections, while ignoreWords are just the words that won't be flagged. Try to be sparing in adding words: if some word is specific to just one file, you can ignore it inline, or you can add the file to the ignorePaths section if you want to ignore it completely.
It is easiest to use cSpell in Visual Studio Code, through the Code Spell Checker extension: it provides linting, suggests corrections from the words section, and enables you to quickly add or ignore words through the .cspell.json file.
Testing#
The fastest way to run all tests is with the command:
pytest -n auto
The flag -n auto causes pytest
to
run with a distributed strategy.
Try to keep test coverage high. You can compute current coverage by running
tox -e cov
and opening htmlcov/index.html
in a browser.
To get an idea of performance per component, run
pytest --profile-svg
and check the stats and the prof/combined.svg
output file.
Note
Jupyter notebooks can also be used as tests. See more info here.
Documentation#
The documentation that you find on ComPWA pages like pwa.rtfd.io is built with Sphinx. Sphinx also builds the API page of the packages and therefore checks whether the docstrings in the Python source code are valid and correctly interlinked.
We make use of Markedly Structured Text (MyST), so you can write the documentation in both Markdown and reStructuredText. In addition, it's easy to write (interactive) code examples in Jupyter notebooks and host them on the website (see MyST-NB)!
Documentation preview#
You can quickly build the documentation with the command:
tox -e doc
If you are doing a lot of work on the documentation,
sphinx-autobuild
is a nice tool to use.
Just run:
tox -e doclive
This will start a server at http://127.0.0.1:8000 where you can continuously preview the changes you make to the documentation.
Finally, a nice feature of Read the Docs, where we host our documentation, is that documentation is built for each pull request as well. This means that you can preview the documentation for your changes. For more info, see here, or just click 'details' under the RTD check once you submit your PR.
Jupyter Notebooks#
The docs folder can also contain Jupyter notebooks. These notebooks are rendered as HTML by MyST-NB. The notebooks are also run and tested whenever you make a pull request, so they also serve as integration tests.
If you want to improve those notebooks, we recommend working with Jupyter Lab, which is installed with the dev requirements. Jupyter Lab offers a nicer developer experience than the default Jupyter notebook editor does. A few useful Jupyter Lab plugins are also installed through the optional dependencies.
Now, if you want to test all notebooks in the documentation folder and check what their output cells will look like in the Documentation, you can do this with:
tox -e docnb
This command takes more time than tox -e doc, but it is good practice to do this before you submit a pull request. It's also possible to continuously generate the HTML pages, including cell output, while you work on the notebooks with:
EXECUTE_NB= tox -e doclive
Tip
Notebooks are automatically formatted through pre-commit (see Formatting). If you want to format the notebooks automatically as you're working, you can do so with jupyterlab-code-formatter, which is automatically installed with the dev requirements. For this, you need to set Ruff as the formatter (see Formatting) for jupyterlab-code-formatter:
1. Open Jupyter Lab Settings (Ctrl+, / ⌘+,).
2. Select the "Jupyterlab Code Formatter" plugin from the left sidebar.
3. Under default_formatter, list two Python formatters: ruffformat and ruff. The first is the actual Ruff formatter that mimics black, the second is for Ruff auto-fixes of linting rules. (The default formatters are isort and black; you should overwrite these.)
4. Optional: activate format-on-save by ticking "Auto format config".
Alternatively, you can open the settings for the "Jupyterlab Code Formatter" as JSON and set it to:
{
"preferences": {
"default_formatter": {
"python": ["ruffformat", "ruff"]
}
},
"formatOnSave": true
}
IJulia notebooks#
Itâs also possible to execute and render Jupyter notebooks with Julia kernels. For this, install Julia and install IJulia:
julia -e 'import Pkg; Pkg.add("IJulia")'

or, from the Julia REPL:

import Pkg
Pkg.add("IJulia")
Usually, this also installs a Jupyter kernel directly. Optionally, you can define a Jupyter kernel manually:
julia -e 'using IJulia; installkernel("julia")'

or, from the Julia REPL:

using IJulia
installkernel("julia")
and select it as kernel in the Jupyter notebook.
Note
As mentioned in Julia, Julia can be installed within your Conda environment through juliaup. This is, however, not yet a virtual environment for Julia itself. You can create a virtual environment for Julia itself by, for instance, defining it through a code cell like this:
using Pkg
Pkg.activate(".") # if environment is defined in this folder
Pkg.instantiate()
See TR-019 for an example.
Additionally, you can install a Language Server for Julia in Jupyter Lab. To do so, run:
julia -e 'import Pkg; Pkg.add("LanguageServer")'

or, from the Julia REPL:

using Pkg
Pkg.add("LanguageServer")
Collaboration#
The source code of all ComPWA repositories is maintained with Git and GitHub. We keep track of issues with the code, documentation, and developer set-up with GitHub issues (see for instance here). This is also the place where you can report bugs.
Tip
If you are new to working with GitHub, have a look at the tutorials on GitHub Skills. For good tutorials on Git, see:
Issue management#
We keep track of issue dependencies, time estimates, planning, pipeline statuses, et cetera with GitHub project boards (GitHub Issues). The main project boards are:
Some issues are not public. To get access, you can request to become member of the ComPWA GitHub organization. Other information that is publicly available are:
Issue labels: help to categorize issues by type (maintenance, enhancement, bug, etc.). The labels are also used in the sub-sections of the release notes.
Milestones: a way to bundle issues and PRs for upcoming releases.
All of these are important for the Release flow and therefore also serve as a way to document the framework.
Branching model#
While our aim is to maintain long-term, stable projects, PWA software projects are academic projects that are subject to change and often require swift modifications or new features for ongoing analyses. For this reason, we work in different layers of development. These layers are represented by Git branches.
stable branch#
Represents the latest release of the package that can be found on both the GitHub release page and on PyPI (see Release flow). The documentation of the stable branch is also the default view you see on Read the Docs (RTD). See e.g. ampform.rtfd.io/en/stable.
main branch#
Represents the upcoming release of the package. This branch is not guaranteed to be stable, but it has high CI standards and can only be updated through reviewed pull requests. The documentation of the main branch can be found on RTD under "latest", see e.g. ampform.rtfd.io/en/latest.
Epic branches#
When working on a feature or larger refactoring that may take a longer time (think of implementing a new PWA formalism), we isolate its development under an 'epic branch', separate from the main branch. Eventually, this epic branch is merged back into main; until then, it is available for discussion and testing.
Pull requests to an epic branch require no code review and the CI checks are less strict. This allows for faster development, while still offering the possibility to discuss new implementations and keeping track of related issues.
Epic branches can be installed with pip as well. Say that a certain epic is located under the branch epic/some-title and that the source code is located under https://github.com/ComPWA/ampform; it can then be installed as follows:
python3 -m pip install git+https://github.com/ComPWA/ampform@epic/some-title
Feature branches#
The main branch and Epic branches can be updated through pull requests. It is best to create such a pull request from a separate branch, which does not have any CI or code review restrictions. We call this a âfeature branchâ.
Commit conventions#
Please use conventional commit messages: start the commit with one of the semantic keywords below in UPPER CASE, followed by a colon, then the commit header. The message itself should be in imperative mood: just imagine the commit giving a command to the code framework. So for instance:
DX: implement coverage report tools
FIX: remove typo in raised `ValueError` message
MAINT: remove redundant print statements
DOC: rewrite welcome pages
BREAK: remove `formulate_model()` alias method
The allowed semantic keywords (commit types) are as follows:

FEAT: new feature added to the package
ENH: improvements and optimizations of existing features
FIX: bug has been fixed
BREAK: breaking changes to the API
BEHAVIOR: changes that may affect the framework output
DOC: improvements or additions to documentation
MAINT: maintenance and upkeep improvements
DX: improvements to the Developer Experience
Keep pull requests small. If the issue you try to address is too big, discuss in the team whether the issue can be converted into an Epic and split up into smaller tasks.
Before creating a pull request, run Tox.
Also use a conventional commit message style for the PR title. This is because we follow a linear commit history and the PR title will become the eventual commit message. A linear commit history is important for the Release flow, and it makes it easier to navigate through changes once something goes wrong. In fact, in a linear commit history, commits that have been merged into the main branch become more like small intermediate patches between the minor and major releases.
Note that a conventional commit message style is enforced through GitHub Actions with commitlint, as well as a check on PR labels (see example here). The commit messages are centrally defined for the ComPWA organization at ComPWA/commitlint-config.

PRs can only be merged through 'squash and merge'. There, you will see a summary based on the separate commits that constitute the PR. Leave the relevant commits in as bullet points. See the commit history for examples. This comes in especially handy when drafting a release!
Release flow#
Releases are managed with the
GitHub release page,
see for instance the one for AmpForm. The
release notes there are
automatically generated from the PRs
that were merged into the main branch since the previous tag and can be viewed and
edited as a release draft if you are a member of the ComPWA organization. Each of the entries is generated from the PR titles, categorized by issue label (see the configuration in .github/release-drafter.yml).
Once a release is made on GitHub for a repository with source code for a Python package,
a new version is automatically published on PyPI and the
stable
branch is updated to this latest tag. The package version is
taken from the Git tag associated with the release on GitHub (see
setuptools-scm). This way, the release notes
on GitHub serve as a changelog as well!
Release tags have to follow the Semantic Versioning scheme! This ensures that the tag can be used by setuptools-scm (in case the repository is a Python package). In addition, milestones with the same name as the release tag are automatically closed.
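To illustrate the tag-to-version link, setuptools-scm is usually enabled through a few lines in pyproject.toml. The snippet below is a hedged sketch, not the actual ComPWA configuration:

```toml
# Illustrative setuptools-scm sketch in pyproject.toml: the package
# version is derived from the latest Git tag instead of being hard-coded.
[build-system]
requires = ["setuptools>=61.2", "setuptools_scm"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
```

With this in place, tagging a commit (e.g. `git tag 0.12.0`) is all that is needed for the built package to carry that version.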
Code editors#
Even though we try to standardize the developer set-up of the repositories, we encourage you to use the code editors that you feel comfortable with. Where possible, we therefore define settings of linters, formatters, etc. in config files that are specific to those tools (using pyproject.toml where possible), not in the configuration files of the editors.
Still, where code editor settings can be shared through configuration files in the repository, we provide recommended settings for the code editor as well. This is especially the case for VSCode.
Tip
We are open to other code editors as well. An example would be maintaining a local vimrc for users who prefer VIM. Other IDEs we'd like to support are PyCharm, Atom, and IntelliJ with Python. So we'll gladly integrate your editor settings where possible as you contribute to the frameworks!
Visual Studio Code#
We recommend using Visual Studio Code as it's free, regularly updated, and very flexible through its wide range of user extensions.
If you add or open this repository as a
VSCode workspace, the
file
.vscode/settings.json
will ensure that you have the right developer settings for this repository. In addition,
VSCode will automatically recommend you to install a number of extensions that we use
when working on this code base.
They are defined
in the
.vscode/extensions.json
file.
You can still specify your own settings in either the user or encompassing workspace settings, as the VSCode settings that come with this are folder settings.
Conda and VSCode
ComPWA projects are best developed with Conda and VSCode. The complete developer install procedure then becomes:
git clone https://github.com/ComPWA/ampform # or some other repo
cd ampform
conda env create
conda activate pwa # or whatever the environment name is
code . # open folder in VSCode
Writing durable software#
ComPWA strives to follow best practices from software development in industry. Following these standards not only makes the code easier to maintain and the software more reliable, it also provides you with the opportunity to learn about these practices while developing the code-base. Below you can find some resources we highly recommend you to be familiar with.
Software development in Python
Scientific Python Library Development Guide for scientists and research software engineers.
Effective Python: 90 Specific Ways to Write Better Python (2019) by Brett Slatkin [Slatkin, 2019]. See also its website.
Clean Code
Clean Code: A Handbook of Agile Software Craftsmanship (2009) by Robert Martin (âUncle Bobâ) [Martin, 2009]
This gist with a comprehensive summary of the core principles of Martinâs Clean Code
Test-Driven Development
Test-Driven Development with Python (2017) by Harry Percival [Percival, 2017]
The classic: Test-Driven Development by Example (2002) by Kent Beck [Beck, 2003]
Types of Software Testing: 100 Examples of Different Testing Types â Guru99
Software Design
Composition over inheritance: Subclassing in Python Redux by Hynek Schlawack. A comprehensive article on the topic with illustrative examples in Python and several references to other important articles.
The classic âGang of Fourâ (GOF) book: Design Patterns (1994) by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides [Gamma, 1995]
C++ Core Guidelines: while this document is intended for C++ developers, it is an excellent, up-to-date set of guidelines that apply to any programming language.
Algorithms
Algorithms (2011) by Robert Sedgewick and Kevin Wayne [Sedgewick and Wayne, 2011]. See also their website, with several links to other resources like their Coursera course.
LeetCode: practice algorithms through coding problems
Bibliography#
Tip
Download this bibliography as BibTeX here
.
B. Slatkin. Effective Python: 90 Specific Ways to Write Better Python. Addison-Wesley, November 2019. ISBN:978-0-13-485398-7.
R. C. Martin, editor. Clean Code: A Handbook of Agile Software Craftsmanship. Prentice Hall, Upper Saddle River, NJ, 2009. ISBN:978-0-13-235088-4.
H. Percival. Test-Driven Development with Python: Obey the Testing Goat: Using Django, Selenium, and JavaScript. O'Reilly Media, Sebastopol, CA, second edition, 2017. ISBN:978-1-4919-5870-4.
K. Beck. Test-Driven Development by Example. The Addison-Wesley Signature Series. Addison-Wesley, Boston, 2003. ISBN:978-0-321-14653-3.
E. Gamma, editor. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley Professional Computing Series. Addison-Wesley, Reading, Mass, 1995. ISBN:978-0-201-63361-0.
R. Sedgewick and K. D. Wayne. Algorithms. Addison-Wesley, Upper Saddle River, NJ, 4th edition, 2011. ISBN:978-0-321-57351-3.