This is my personal python repository bootstrap.
Feel free to use it as a launching point for your next project!
For a simplified/minimal starter package, see the simple branch: https://github.com/tlambert03/pyrepo-cookiecutter/tree/simple
I recommend using cruft instead of cookiecutter (this will let you update it easily later):
pip install cruft
cruft create https://github.com/tlambert03/pyrepo-cookiecutter
or you can use cookiecutter as usual:
pip install cookiecutter
cookiecutter https://github.com/tlambert03/pyrepo-cookiecutter
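Both tools can also run non-interactively, accepting the template defaults for every prompt; a minimal sketch, assuming your installed cookiecutter supports the `--no-input` flag (current releases do, and cruft passes the same flag through):

```sh
# accept all template defaults instead of answering prompts interactively
cookiecutter https://github.com/tlambert03/pyrepo-cookiecutter --no-input
```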
After creating the repo, you'll want to initialize a git repository. This is important: you won't be able to run `pip install -e .` without running `git init` first, because the package version is derived from git via hatch-vcs (see below):
cd <your-package-name>
git init
git add .
git commit -m 'build: Initial Commit'
Optionally, install pre-commit:
pip install pre-commit
pre-commit autoupdate
pre-commit install
git add .
git commit -m 'chore: update pre-commit'
To run tests locally, you'll need to install the package in editable mode.
I like to first create a new environment dedicated to my package:
mamba create -n <your-package-name> python
mamba activate <your-package-name>
Then install the package in editable mode:
pip install -e .[test]
If you run into problems here, make sure that you ran `git init` above!
Finally, run the tests:
pytest
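If you also want a coverage report, and assuming the template's `test` extra pulls in pytest-cov (check your project's `pyproject.toml` to be sure), something like this should work:

```sh
# run the test suite and print per-file coverage with missing line numbers
pytest --cov --cov-report=term-missing
```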
If you have the GitHub CLI installed, and would like to create a GitHub repository for your new package:
gh repo create --source=. --public --remote=origin --push
Alternatively, you can follow GitHub's guide for adding a local repository to GitHub.
- If you'd like, set up the pre-commit.ci service to run all pre-commit checks on every PR (in case contributors aren't running them locally). Note that you can always run checks locally with `pre-commit run -a`.
- Follow the links below for more info on the included tools (pay particular attention to hatch and ruff).
- See how to Deploy to PyPI below.
- PEP 517 build system with hatch backend
  - build with `python -m build`, not `python setup.py`! (see the sketch after this list)
- PEP 621 metadata in `pyproject.toml`
  - all additional configurables are also in `pyproject.toml`, with links to documentation
- uses `src` layout (How come?)
- git tag-based versioning with hatch-vcs
- autodeploy to PyPI on tagged commit (set `TWINE_API_KEY` env var on GitHub). See Deploying to PyPI below.
- Testing with pytest
- CI & testing with GitHub Actions
- GitHub Actions cron-job running tests against dependency pre-releases (using `--pre` to install dependencies).
- pre-commit with
  - ruff - amazing linter and formatter. Takes the place of flake8, autoflake, isort, pyupgrade, and more...
  - black - opinionated code formatter
  - mypy - static type hint checker (defaults to `strict` mode)
  - conventional-pre-commit - enforce good commit messages (this is commented out by default). See Conventional Commits below.
- check-manifest - test to check completeness of files in your release.
- I use and include github-changelog-generator to automate changelog generation... but there are probably better options now (this is a hot topic).
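As a quick sketch of what the build step looks like locally (CI does the same thing for you on tagged commits):

```sh
# build an sdist and a wheel with the PEP 517 frontend (not setup.py)
pip install build
python -m build
# the artifacts land in dist/; the version in the filenames comes from your git tag
ls dist/
```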
When I'm ready to deploy a version, I tag the commit with a version number and push it to GitHub. This will trigger a GitHub Action that will build and deploy to PyPI (see the "deploy" step in `workflows/ci.yml`). The version number is determined by the git tag using hatch-vcs, which wraps setuptools-scm.
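If you want to preview the version that would be derived from your current git state before tagging, here is a small sketch; it assumes a reasonably recent setuptools-scm, which can be run as a module:

```sh
# print the version setuptools-scm (and therefore hatch-vcs) would infer right now
pip install setuptools-scm
python -m setuptools_scm
```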
To auto-deploy to PyPI, you'll need to set a `TWINE_API_KEY` environment variable in your GitHub repo settings. You can get this key from your PyPI account. Then add it to your GitHub repository settings as a secret named `TWINE_API_KEY` (see the GitHub docs). The name `TWINE_API_KEY` is specified in `workflows/ci.yml`.
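If you have the GitHub CLI installed (as used above to create the repo), you can add the secret from the terminal instead of the web UI:

```sh
# store your PyPI API token as a repository secret named TWINE_API_KEY
# (gh will prompt you to paste the pypi-... token value)
gh secret set TWINE_API_KEY
```

Then, to cut a release, tag and push: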
git tag -a v0.1.0 -m v0.1.0
git push --follow-tags
# or, specify a remote:
# git push upstream --follow-tags
If you're curious, see also some thoughts on semantic releases below.
Conventional Commits is a specification for adding human and machine readable meaning to commit messages. Using it faithfully will allow you to automate a lot of things (changelogs, versioning, ...) at release time. To use it here:
- Use the `conventional-pre-commit` step in pre-commit. It will force you to use conventional commits locally (a few example messages are sketched after this list).
- [VS Code]: Add the Conventional Commits extension, making it easier to create conventional commits.
- This still doesn't protect GitHub PR commits, so add the Semantic PRs GitHub App to check that PR titles follow the Conventional Commits spec (and require squash commits).
- Protect the `main` branch:
  - use only PRs for `main`
  - use squash merge
  - require the `Semantic PRs` check to pass for merging
  - consider allowing only `semantic-release` to push to that branch.
- Future: use frappucino or griffe to detect breaking API changes & add a GitHub action to enforce a `!` in the title or `BREAKING CHANGE` footer.
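For reference, a few commit messages that follow the spec (the messages themselves are made up; the `feat`/`fix` types and the `!` marker come from the Conventional Commits specification):

```sh
# a new feature (bumps the minor version under SemVer)
git commit -m "feat: add a --verbose flag to the CLI"
# a bug fix (bumps the patch version)
git commit -m "fix: handle empty config files without crashing"
# a breaking change, flagged with ! (bumps the major version)
git commit -m "feat!: drop support for python 3.8"
```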
I'm still not sure how I feel about SemVer. Seems better than nothing, but also totally broken. I highly recommend these articles for insight:
- Semantic Versioning Will Not Save You - Hynek Schlawack
- Version numbers: how to use them? - Bernát Gábor
- Why I don't like SemVer anymore - Brett Cannon
- Should You Use Upper Bound Version Constraints? - Henry Schreiner
- Versioning Software - Donald Stufft
One of the biggest problems with SemVer is humans implementing it (see ZeroVer 😂). One approach is to use fully-automated version & release management to take the human out of it. semantic-release is popular in the JavaScript world, and a python-semantic-release variant exists. If you want to try it, this repo configures that in `pyproject.toml`:

- Set up python-semantic-release on GitHub Actions (see `ci.yml`)
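If you want to poke at it locally first, a rough sketch (the exact CLI and flags differ between python-semantic-release major versions, so treat this as an assumption and check `semantic-release --help`):

```sh
# install the tool and determine/apply the next version from your conventional-commit history
pip install python-semantic-release
semantic-release version
```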
This template may change over time, bringing in new improvements, fixes, and updates. To update an existing project that was created from this template using cruft, run `cruft update` in the root of the project. See the cruft docs for details.
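A minimal sketch of that workflow, assuming cruft's `check` subcommand is available in your version (verify against the cruft docs linked above):

```sh
cd <your-package-name>
# report whether the project is up to date with the template
cruft check
# pull in any template changes (you'll be prompted to review and apply the diff)
cruft update
```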
- cookiecutter-hypermodern-python (this one is a bit much for me but is an amazing reference for modern best practices in python repo maintenance)