# Contributing

## Summary

PRs welcome!

- Consider starting a discussion to see if there's interest in what you want to do.
- Submit PRs from feature branches on forks to the `develop` branch.
- Ensure PRs pass all CI checks.
- Maintain test coverage at 100%.
## Git
- Why use Git? Git enables creation of multiple versions of a code repository called branches, with the ability to track and undo changes in detail.
- Install Git by downloading from the website, or with a package manager like Homebrew.
- Configure Git to connect to GitHub with SSH.
- Fork this repo.
- Create a branch in your fork.
- Commit your changes with a properly-formatted Git commit message.
- Create a pull request (PR) to incorporate your changes into the upstream project you forked.
## Python

### Hatch

This project uses Hatch for dependency management and packaging.

#### Highlights

- Automatic virtual environment management: Hatch automatically manages the application environment.
- Dependency resolution: Hatch will automatically resolve any dependency version conflicts using the `pip` dependency resolver.
- Dependency separation: Hatch supports separate lists of optional dependencies in pyproject.toml. Production installs can skip optional dependencies for speed.
- Builds: Hatch has features for easily building the project into a Python package and publishing the package to PyPI.
#### Installation

Hatch can be installed with Homebrew or `pipx`.

Install the project with all dependencies: `hatch env create`.
#### Key commands

```sh
# Basic usage: https://hatch.pypa.io/latest/cli/reference/
hatch env create  # create virtual environment and install dependencies
hatch env find  # show path to virtual environment
hatch env show  # show info about available virtual environments
hatch run COMMAND  # run a command within the virtual environment
hatch shell  # activate the virtual environment, like source venv/bin/activate
hatch version  # list or update version of this package
export HATCH_ENV_TYPE_VIRTUAL_PATH=.venv  # install virtualenvs into .venv
```
### Running the development server
The easiest way to get started is to run the development server locally with the VSCode debugger. The debugger config is stored in launch.json. After creating the virtual environment as described above, start the debugger. Uvicorn enables hot-reloading and addition of debug breakpoints while the server is running. The Microsoft VSCode Python extension also offers a FastAPI debugger configuration, added in version 2020.12.0, which has been customized and included in launch.json. To use it, select the FastAPI config and start the debugger.
As explained in the VSCode docs, if developing on Linux, note that non-root users may not be able to expose ports less than 1024.
### Testing with pytest

- Tests are in the tests/ directory.
- Run tests by invoking `pytest` from the command line within the virtual environment, in the root directory of the repo.
- pytest features used include:
- pytest plugins include:
- pytest configuration is in pyproject.toml.
- FastAPI testing and Starlette testing rely on the Starlette `TestClient`.
- Test coverage reports are generated by coverage.py. To generate test coverage reports, first run tests with `coverage run`, then generate a report with `coverage report`. To see interactive HTML coverage reports, run `coverage html` instead of `coverage report`.
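pytest tests are plain functions whose names start with `test_`, containing bare `assert` statements. A minimal hypothetical sketch (not taken from this repo's test suite):

```python
def test_status_code_is_success() -> None:
    """Hypothetical test: pytest collects functions named test_* and
    reports a failure for any assert that evaluates to False."""
    status = int("200")
    assert status == 200
    assert 200 <= status < 300
```

Invoking `pytest` from the repo root discovers and runs functions like this automatically.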
## Docker

### Docker basics

- Docker is a technology for running lightweight virtual machines called containers.
- An image is the executable set of files read by Docker.
- A container is a running image.
- The Dockerfile tells Docker how to build the container.
- To get started with Docker:
    - Ubuntu Linux: follow the instructions for Ubuntu Linux, making sure to follow the postinstallation steps to activate the Docker daemon.
    - macOS and Windows: install Docker Desktop (available via Homebrew with `brew install --cask docker`).
More useful Docker commands:
```sh
# Log in with Docker Hub credentials to pull images
docker login
# List images
docker images
# List running containers: can also use `docker container ls`
docker ps
# View logs for the most recently started container
docker logs -f $(docker ps -q -n 1)
# View logs for all running containers
docker logs -f $(docker ps -aq)
# Inspect a container (web in this example) and return the IP address
docker inspect web | grep IPAddress
# Stop a container
docker stop # container hash
# Stop all running containers
docker stop $(docker ps -aq)
# Remove a downloaded image
docker image rm # image hash or name
# Remove a container
docker container rm # container hash
# Prune images
docker image prune
# Prune stopped containers (completely wipes them and resets their state)
docker container prune
# Prune everything
docker system prune
# Open a shell in the most recently started container (like SSH)
docker exec -it $(docker ps -q -n 1) /bin/bash
# Or, connect as root:
docker exec -u 0 -it $(docker ps -q -n 1) /bin/bash
# Copy a file from a container to the host:
docker cp [container_name]:/path/to/file destination.file
```
### Building development images

Note that Docker builds use BuildKit. See the BuildKit docs and Docker docs.

To build the Docker images for each stage:

```sh
git clone git@github.com:br3ndonland/inboard.git
cd inboard
export DOCKER_BUILDKIT=1
docker build . --rm --target base -t localhost/br3ndonland/inboard:base && \
  docker build . --rm --target fastapi -t localhost/br3ndonland/inboard:fastapi && \
  docker build . --rm --target starlette -t localhost/br3ndonland/inboard:starlette
```
### Running development containers

```sh
# Run Docker containers with Uvicorn and reloading
cd inboard
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:base
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:fastapi
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:starlette
# Run Docker containers with Gunicorn and Uvicorn
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:base
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:fastapi
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:starlette
# Test HTTP Basic auth when running the FastAPI or Starlette images:
http :80/status -a test_user:r4ndom_bUt_memorable
```
Change the port numbers to run multiple containers simultaneously (`-p 81:80`).
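The HTTPie command above authenticates with HTTP Basic auth. As a standard-library sketch of what that `Authorization` header contains (the helper name is hypothetical; the credentials are the test values from the commands above):

```python
import base64


def basic_auth_header(username: str, password: str) -> str:
    """Build the value of an HTTP Basic Authorization header (RFC 7617):
    the word "Basic" followed by base64("username:password")."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


print(basic_auth_header("test_user", "r4ndom_bUt_memorable"))
```

Any HTTP client (HTTPie's `-a`, `curl -u`, or the Starlette `TestClient`'s `auth` parameter) constructs this header for you.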
## Code quality

### Code style

- Python code is formatted with Black. Configuration for Black is stored in pyproject.toml.
- Python imports are organized automatically with isort.
    - The isort package organizes imports in three sections:
        - Standard library
        - Dependencies
        - Project
    - Within each of those groups, `import` statements occur first, then `from` statements, in alphabetical order.
    - You can run isort from the command line with `hatch run isort .`.
    - Configuration for isort is stored in pyproject.toml.
- Other web code (JSON, Markdown, YAML) is formatted with Prettier.
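The import ordering described above can be sketched with standard-library modules only (the dependency and project sections are omitted here):

```python
# Standard-library section of an isort-formatted module:
# `import` statements first, then `from` statements, each alphabetized.
# (Dependency and project sections would follow, separated by blank lines.)
import json
import sys

from collections import Counter
from pathlib import Path

# Trivial usage so the example runs as-is.
suffix_counts = Counter(Path("pyproject.toml").suffix)
print(json.dumps(dict(suffix_counts)), file=sys.stdout)
```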
### Static type checking

- To learn type annotation basics, see the Python typing module docs, the Python type annotations how-to, the Real Python type checking tutorial, and this gist.
- Type annotations are not used at runtime. The standard library `typing` module includes a `TYPE_CHECKING` constant that is `False` at runtime, but `True` when conducting static type checking prior to runtime. Type imports are placed under `if TYPE_CHECKING:` conditions so that they are not imported at runtime. These conditions are ignored when calculating test coverage.
- Type annotations can be provided inline or in separate stub files. Much of the Python standard library is annotated with stubs. For example, the Python standard library `logging.config` module uses type stubs. The typeshed types for the `logging.config` module are used solely for type-checking usage of the `logging.config` module itself. They cannot be imported and used to type annotate other modules.
- The standard library `typing` module includes a `NoReturn` type. This would seem useful for unreachable code, including functions that do not return a value, such as test functions. Unfortunately, mypy reports an error when using `NoReturn`: "Implicit return in function which does not return (misc)." To avoid headaches from the opaque "misc" category of mypy errors, these functions are annotated as returning `None`.
- Mypy is used for type-checking. Mypy configuration is included in pyproject.toml.
- Mypy strict mode is enabled. Strict mode includes `--no-explicit-reexport` (`implicit_reexport = false`), which means that objects imported into a module will not be re-exported for import into other modules. Imports can be made into explicit exports with the syntax `from module import x as x` (i.e., changing from `import logging` to `import logging as logging`), or by including imports in `__all__`. This explicit import syntax can be confusing. Another option is to apply mypy overrides to any modules that need to leverage implicit exports.
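The `TYPE_CHECKING` pattern described above might look like this in a hypothetical module (not taken from inboard):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only during static type checking (e.g., by mypy);
    # skipped at runtime because TYPE_CHECKING is False.
    from logging import Logger


def logger_name(logger: Logger) -> str:
    # With `from __future__ import annotations`, the Logger annotation
    # is never evaluated at runtime, so the guarded import is safe.
    return logger.name


import logging

print(logger_name(logging.getLogger("inboard.example")))  # inboard.example
```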
### Pre-commit

Pre-commit runs Git hooks. Configuration is stored in .pre-commit-config.yaml. It can run locally before each commit (hence "pre-commit"), or on other Git events like `pre-push`. Pre-commit is installed in the Python virtual environment. To use:

```sh
cd path/to/inboard
hatch env create
hatch shell
# install hooks that run before each commit
pre-commit install
# and/or install hooks that run before each push
pre-commit install --hook-type pre-push
```
### Spell check

Spell check is performed with CSpell.

In GitHub Actions, CSpell runs using cspell-action.

To run spell check locally, consider installing the CSpell VSCode extension or running CSpell from the command line.

CSpell can be run with `pnpm` if pnpm is installed:

```sh
pnpm -s dlx cspell --dot --gitignore "**/*.md"
```

or with `npx` if npm is installed:

```sh
npx -s -y cspell --dot --gitignore "**/*.md"
```

CSpell also offers a pre-commit hook through their cspell-cli repo. A .pre-commit-config.yaml configuration could look like this:

```yaml
repos:
  - repo: https://github.com/streetsidesoftware/cspell-cli
    rev: v6.16.0
    hooks:
      - id: cspell
        files: "^.*\\.md$"
        args: ["--dot", "--gitignore", "**/*.md"]
```

CSpell is not currently used with pre-commit in this project because the behavior of the pre-commit hook is inconsistent:

- CSpell matches files with globs, but pre-commit uses Python regexes. Two separate file patterns have to be specified (a regex for pre-commit and a glob for CSpell), which is awkward and error-prone.
- When running with pre-commit, CSpell seems to output each error multiple times.
## GitHub Actions workflows
GitHub Actions is a continuous integration/continuous deployment (CI/CD) service that runs on GitHub repos. It replaces other services like Travis CI. Actions are grouped into workflows and stored in .github/workflows. See Getting the Gist of GitHub Actions for more info.
## Maintainers

### Merges

- The default branch is `develop`.
- PRs should be merged into `develop`. Head branches are deleted automatically after PRs are merged.
- The only merges to `main` should be fast-forward merges from `develop`.
- Branch protection is enabled on `develop` and `main`.
    - `develop`:
        - Require signed commits
        - Include administrators
        - Allow force pushes
    - `main`:
        - Require signed commits
        - Include administrators
        - Do not allow force pushes
        - Require status checks to pass before merging (commits must have previously been pushed to `develop` and passed all checks)
### Releases

- Each minor version release of FastAPI or Uvicorn should get its own corresponding minor version release of inboard. FastAPI depends on Starlette, so any updates to the Starlette version required by FastAPI should be included with updates to the FastAPI version.
- To create a release:
    - Bump the version number in `inboard.__version__` with `hatch version` and commit the changes to `develop`.
        - Follow SemVer guidelines when choosing a version number. Note that PEP 440 Python version specifiers and SemVer version specifiers differ, particularly with regard to specifying prereleases. Use syntax compatible with both.
        - The PEP 440 default (like `1.0.0a0`) is different from SemVer. Hatch and PyPI will use this syntax by default.
        - An alternative form of the Python prerelease syntax permitted in PEP 440 (like `1.0.0-alpha.0`) is compatible with SemVer, and this form should be used when tagging releases. As Hatch uses PEP 440 syntax by default, prerelease versions need to be written directly into `inboard.__version__`.
        - Examples of acceptable tag names: `1.0.0`, `1.0.0-alpha.0`, `1.0.0-beta.1`
    - Push to `develop` and verify all CI checks pass.
    - Fast-forward merge to `main`, push, and verify all CI checks pass.
    - Create an annotated and signed Git tag.
        - List PRs and commits in the tag message: `git log --pretty=format:"- %s (%h)" "$(git describe --abbrev=0 --tags)"..HEAD`
        - Omit the leading `v` (use `1.0.0` instead of `v1.0.0`).
        - Example: `git tag -a -s 1.0.0`
    - Push the tag. GitHub Actions will build and push the Python package and Docker images, and open a PR to update the changelog.
    - Squash and merge the changelog PR, removing any `Co-authored-by` trailers before merging.
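The tag-name rules above can be sketched as a small check (hypothetical helper, not part of the release tooling):

```python
import re

# Accepts the example tag names above: 1.0.0, 1.0.0-alpha.0, 1.0.0-beta.1.
# Assumption: only alpha/beta/rc prereleases, per the examples; full SemVer
# permits a broader prerelease grammar.
TAG_PATTERN = re.compile(r"^\d+\.\d+\.\d+(?:-(?:alpha|beta|rc)\.\d+)?$")


def is_acceptable_tag(tag: str) -> bool:
    return TAG_PATTERN.fullmatch(tag) is not None


for tag in ("1.0.0", "1.0.0-alpha.0", "1.0.0-beta.1"):
    print(tag, is_acceptable_tag(tag))  # all True

print(is_acceptable_tag("v1.0.0"))  # False: the leading v is omitted
```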
### Deployments

Documentation is built with Material for MkDocs, deployed on Vercel, and available at inboard.bws.bio and inboard.vercel.app.

- Build command: `python3 -m pip install 'mkdocs-material>=9,<10' && mkdocs build --site-dir public`. The version of `mkdocs-material` installed on Vercel is independent of the version listed in pyproject.toml. If the version of `mkdocs-material` is updated in pyproject.toml, it must also be updated in the Vercel build configuration.
- Output directory: `public` (default)
Vercel site configuration is specified in vercel.json.