Development guide#

This guide outlines the basic process for developing the project locally, along with the recommended tools and workflows for making changes to the codebase.

Development setup#

Important

If you only want to use the package, the simplest option is to install it from PyPI (https://pypi.org/project/snf-simulations/).

The setup steps below are intended for anyone wanting to make new additions or changes to the codebase.

Alternative: venv + pip#

You can still install the package for development using the traditional tools. Creating a virtual environment is highly recommended to isolate the development environment from the system Python. Note that the pip command installs the package in editable mode, again with the dev dependency group to include development-only dependencies.

git clone https://github.com/ekneale/SNF-simulations.git
cd SNF-simulations
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
pip install -e ".[dev]"

Repository layout#

This project has the following structure:

  • src/snf_simulations/: installable package code.

  • src/snf_simulations/data/: packaged data files used at runtime.

  • src/snf_simulations/scripts/: command-line scripts (accessed through entry points defined in pyproject.toml).

  • tests/: unit and integration tests.

  • docs/: Sphinx + MyST documentation source.

  • .github/workflows/: GitHub Actions CI workflows.

Development tools#

Ruff (linting and formatting)#

Ruff is configured in pyproject.toml and used for style and static lint checks. Ruff will be run automatically on every commit (using prek) and pull request (using GitHub Actions), but you can also run it locally to check for any issues before pushing your changes.

If you have the development environment set up with uv, you can run:

uv run ruff check

to check for any linting issues in the source code, and

uv run ruff format

to automatically fix any formatting issues. Using the uv run prefix ensures you’re using the version of Ruff installed in the development environment, which is important for consistency across contributors and CI.

If you are not using uv, you can run ruff directly from the command line (the same applies to the other tools mentioned below), for example:

ruff check
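As a hypothetical illustration of the kind of issue ruff check reports, Ruff's rule F841 flags a local variable that is assigned but never used (the function below is invented for this example, not taken from the package):

```python
# example.py — hypothetical snippet; `ruff check` would flag the
# unused variable `n` with F841 ("local variable is assigned to
# but never used").
def mean(values):
    n = len(values)  # F841: assigned but never used
    total = sum(values)
    return total / len(values)
```

Removing the unused assignment (or prefixing the name with an underscore) resolves the warning.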

ty (type checking)#

Type checking is handled with ty.

With uv you can run the type checker with:

uv run ty check

Important

Ty is still in early development, so it is not included in the automatic pre-commit or CI checks; running it locally when developing new classes or functions is nevertheless recommended.
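As a sketch of what a type checker such as ty catches, consider a function with annotated parameters (the function and names here are illustrative assumptions, not part of the package):

```python
# Hypothetical snippet: the annotations tell the type checker what
# each argument must be, so mismatched calls are flagged statically.
def scale(values: list[float], factor: float) -> list[float]:
    """Multiply every value by `factor`."""
    return [v * factor for v in values]

good = scale([1.0, 2.0], 2.0)   # fine: matches the annotations
# bad = scale([1.0, 2.0], "2")  # a type checker flags this call:
#                               # `str` is not assignable to `float`
```

The flagged call never runs, so errors like this are caught before the tests (or users) hit them at runtime.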

pytest (testing)#

Tests are run with pytest, with optional coverage reporting using pytest-cov.

With uv, you can run the tests with:

uv run pytest tests/

This will run all the tests in the tests/ directory, checking that they all complete correctly and flagging any failures. You can also specify a particular test file or test function to run, for example:

uv run pytest tests/test_example.py::test_function

This can be useful for running a smaller subset of tests while developing new code or debugging specific issues.
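A test module is just a file whose name starts with test_ containing functions whose names start with test_; pytest collects these automatically and treats a failing assert as a test failure. A minimal hypothetical example (the function here stands in for something imported from snf_simulations):

```python
# tests/test_example.py — hypothetical minimal test module.
def half_life_fraction(n_half_lives: float) -> float:
    """Stand-in for a function imported from snf_simulations."""
    return 0.5 ** n_half_lives

def test_one_half_life():
    assert half_life_fraction(1) == 0.5

def test_two_half_lives():
    assert half_life_fraction(2) == 0.25
```

Each test function runs independently, so a failure in one is reported without stopping the others.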

In order to include code coverage information, you can run:

uv run pytest tests/ --cov=snf_simulations

This will generate a coverage report showing which lines of code are exercised by the tests, which helps identify gaps in test coverage that may need to be addressed.

Prek (pre-commit checks)#

prek automatically runs various checks before changes are committed to Git, which means only compliant code will be committed.

The included hooks are defined in prek.toml. Prek comes with several built-in checks, for things like ensuring correct file endings and preventing large files from being committed (see https://prek.j178.dev/builtin/). The current config also includes a hook for ruff check and ruff format for linting and formatting (see https://github.com/astral-sh/ruff-pre-commit). Currently type checking with ty is not yet included, as there is no pre-commit hook available (see https://github.com/astral-sh/ty/issues/269).

As the pytest test suite can be time-consuming to run, following standard practice it is also not included in the pre-commit hooks. It is recommended that you run pytest manually before pushing any changes, although the tests will be run automatically by the CI pipeline on GitHub.

To run the pre-commit checks manually, you can use the prek command directly or through uv. By default prek run will run the checks on all staged files, but you can also include the --all-files flag to run the checks on all files in the repository:

uv run prek run

In practice, prek is best installed as a pre-commit hook into the local Git repository, so that it runs automatically whenever you commit changes. To set up the pre-commit hook, you can run:

uv run prek install

Once this is installed, the checks will run automatically whenever you commit changes with Git. For any of the formatting hooks, attempted fixes will be automatically staged for commit, so you can just review the changes and commit as normal. If any checks fail, the commit will be blocked until the issues are resolved.

GitHub Actions (continuous integration)#

Source control, issues, pull requests, and releases are managed on GitHub, as described in the contributing guide.

The project currently uses GitHub Actions to automate workflows for linting, testing, documentation, and publishing. The workflow files are located in .github/workflows/. The main continuous integration (CI) workflows run on every pull request or change to the codebase, and merging PRs will be blocked if any checks fail.

The outcomes of all the triggered actions can be viewed in the “Actions” tab of the GitHub repository https://github.com/ekneale/SNF-simulations/actions, where you can see the logs for each workflow run. When a pull request is opened, you can also see the status of the checks directly on the PR page, and click through to view the logs for any failed checks.

A notable action that is not defined in the repository is pages-build-deployment, which handles the automatic documentation build and deployment to GitHub Pages (see below). Whenever changes are merged into the main branch, the documentation is automatically rebuilt and published to https://ekneale.github.io/SNF-simulations/, so any codebase changes that affect the documentation are reflected on the live site automatically.

Sphinx (documentation)#

Documentation is built with Sphinx and written in Markdown using MyST.

To build the documentation locally, you can run:

cd docs
make html

The generated HTML files will be located in docs/_build/html/, and you can open index.html in a web browser to view the documentation.

The API documentation is automatically generated from docstrings using Sphinx’s autodoc2 extension, so any new functions or classes you add should be properly documented with docstrings to ensure they are included in the API reference pages.
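A hypothetical example of the kind of docstring that autodoc2 picks up (the function name, physics, and docstring style here are illustrative assumptions; match whatever convention the existing codebase uses):

```python
def remaining_fraction(t: float, half_life: float) -> float:
    """Return the fraction of undecayed nuclei remaining after time ``t``.

    Hypothetical example function. autodoc2 renders this docstring
    on the API reference page for the module that defines it.

    :param t: elapsed time, in the same units as ``half_life``.
    :param half_life: half-life of the isotope.
    :return: remaining fraction, between 0 and 1.
    """
    return 0.5 ** (t / half_life)
```

Undocumented functions still appear in the generated reference, but with no description, so a docstring on every public function keeps the API pages useful.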

As described above, when changes are made to the source code on GitHub the documentation will be automatically rebuilt and published to a site hosted on GitHub Pages: https://ekneale.github.io/SNF-simulations/. This ensures that the live documentation site is always up to date with the latest code changes.

The documentation pages use the PyData Sphinx Theme.

Releasing new versions#

Important

This is only relevant for maintainers of the project.

Package releases are created through the GitHub Releases system. New releases are published to PyPI automatically by the GitHub Actions publish.yml workflow.

Versions are tagged through the GitHub Release flow (rather than by creating tags separately), so that release metadata, Git tags, and automated publishing stay aligned. The local Python package versioning is managed through setuptools-scm.
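As a rough sketch of how setuptools-scm's default "guess-next-dev" scheme maps Git state to a version string (assuming the project uses the defaults; this is an illustration, not setuptools-scm itself):

```python
# Illustrative sketch only — setuptools-scm derives this from Git
# metadata and, for untagged commits, also appends local-version
# information such as "+g<hash>".
def guess_next_dev(tag: str, distance: int) -> str:
    """Map a release tag plus commits-since-tag to a version string."""
    base = tag.removeprefix("v")
    if distance == 0:
        return base  # exactly on the tag: the release version itself
    major, minor, patch = (int(p) for p in base.split("."))
    return f"{major}.{minor}.{patch + 1}.dev{distance}"
```

So a commit tagged v1.2.3 builds as version 1.2.3, while later untagged commits get development versions, which is why creating the tag through the GitHub Release flow is what actually sets the published version.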