A tool to automatically create a Python project structure ready to release via GitHub and PyPI. It will also set up:
- A blank `README.md` file
- A `LICENSE` file
- Pre-commit hooks to automate linting checks and formatting
- Automatic versioning using `setuptools_scm`
- A structure for automated tests using `pytest`
- Automated formatting checks, testing and release using GitHub actions
- Documentation using Sphinx

Based on cookiecutter-napari-plugin.
Table of contents:
- Set up
- Add your modules and tests
- Before committing your changes
- Versioning
- GitHub actions workflow
- Documentation
First, install cookiecutter in your desired environment. Run the following in a terminal in that environment, with pip:

```
pip install cookiecutter
```

or with conda:

```
conda install -c conda-forge cookiecutter
```

In the folder where you want to create the repo, run:

```
cookiecutter https://github.com/neuroinformatics-unit/python-cookiecutter
```
You will then be asked a series of questions about how you want to set up your project.
For each one, type your answer, enter a single number to choose from the options, or just hit return to accept the default.
- `full_name [Python developer]:`
  - e.g. Adam Tyson
- `email [[email protected]]:`
  - e.g. [email protected]
- `github_username_or_organization [githubuser]:`
  - e.g. adamltyson
- `package_name [python-package]:`
  - e.g. my-awesome-software
- `Select github_repository_url:`
  - The default will be e.g. https://github.com/adamltyson/my-awesome-software, but you can also provide this later.
- `module_name [my_awesome_software]:`
  - The default will be the same as `package_name` but with hyphens converted to underscores.
- `short_description [A simple Python package]:`
  - Enter a simple, one-line description of your Python package.
- `Select license:`
  - Choose from:
    - `1 - BSD-3`
    - `2 - MIT`
    - `3 - Mozilla Public License 2.0`
    - `4 - Apache Software License 2.0`
    - `5 - GNU LGPL v3.0`
    - `6 - GNU GPL v3.0`
- `Select create_docs:`
  - Whether to generate documentation using Sphinx. Choose from:
    - `1 - Yes`
    - `2 - No`
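If you prefer to script project creation rather than answer the prompts interactively, cookiecutter also exposes a Python API. Below is a minimal, hedged sketch; the `extra_context` keys mirror the prompts above and the values are placeholder examples, so adjust them to your own details.

```python
# Sketch: create the project non-interactively via cookiecutter's Python API.
# The extra_context keys mirror the prompts above; values are example placeholders.
from cookiecutter.main import cookiecutter

cookiecutter(
    "https://github.com/neuroinformatics-unit/python-cookiecutter",
    no_input=True,  # skip the interactive prompts
    extra_context={
        "full_name": "Adam Tyson",
        "github_username_or_organization": "adamltyson",
        "package_name": "my-awesome-software",
        "short_description": "A simple Python package",
    },
)
```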
This is the structure cookiecutter will create:
```
└── my-awesome-software/
    ├── LICENSE
    ├── MANIFEST.in
    ├── README.md
    ├── pyproject.toml
    ├── tox.ini
    ├── my_awesome_software/
    │   └── __init__.py
    └── tests/
        ├── __init__.py
        ├── test_integration/
        │   └── __init__.py
        └── test_unit/
            ├── __init__.py
            └── test_placeholder.py
```
A project with this information will then be written to the current working directory.
If you respond positively to `Select create_docs:`, an additional `docs` folder will be created and two example Python modules (`math.py` and `greetings.py`) will be added to the above structure:
```
└── my-awesome-software/
    ├── docs/
    │   ├── make.bat
    │   ├── Makefile
    │   ├── requirements.txt
    │   └── source/
    │       ├── api_index.rst
    │       ├── conf.py
    │       ├── getting_started.md
    │       └── index.rst
    └── my_awesome_software/
        ├── __init__.py
        ├── greetings.py
        └── math.py
```
Although it asks for a GitHub username or organization and package name, it does not initialize a git repository.
To do so, navigate to your project folder:

```
cd my-awesome-software
```

and run:

```
git init -b main
```
N.B. If you have an older version of Git (<v2.28), this will produce an error and you will need to run the following:
```
git init
git checkout -b main
```
Then stage and commit your changes:
```
git add .
git commit -m "Initial commit"
```
On GitHub, create a new empty repository, then locally add the remote origin and push:

```
git remote add origin git@github.com:adamltyson/my-awesome-software.git
git push -u origin main
```
Your methods and classes would live inside the folder `my_awesome_software`. Split the functionality into modules, and save them as `.py` files, e.g.:

```
my_awesome_software
├── __init__.py
├── greetings.py
└── math.py
```
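For orientation, a module such as `math.py` might contain something like the sketch below; the function names `add_two_integers` and `subtract_two_integers` are the ones used in the snippets that follow, but the exact contents of the generated examples may differ.

```python
# filename: math.py
# A minimal sketch of an example module; the generated file may differ.


def add_two_integers(a: int, b: int) -> int:
    """Return the sum of two integers."""
    return a + b


def subtract_two_integers(a: int, b: int) -> int:
    """Return the difference between two integers."""
    return a - b
```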
If you want to import methods and classes from one module into another, you can use a relative (dot) import:

```
# filename: greetings.py
from .math import subtract_two_integers
```

If you want to import all the modules when the package is imported, you can add the following to your `__init__.py`:

```
from . import *
```
To ensure any dependencies are installed at the same time as your package, add them to your `pyproject.toml` file. E.g. to add `numpy` and `pandas` as dependencies, add them to the `dependencies = []` list under the `[project]` heading:

```
dependencies = ["numpy", "pandas"]
```
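Once your package is installed, you can double-check which dependencies it declares directly from Python. A small sketch using only the standard library (the distribution name `my-awesome-software` is the example used throughout):

```python
# Sketch: inspect the dependencies declared by an installed package.
from importlib.metadata import requires

# Returns the Requires-Dist entries from the package metadata,
# e.g. ["numpy", "pandas", ...] for the example above.
print(requires("my-awesome-software"))
```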
Write your test methods and classes in the `tests` folder. We are using pytest.
In your test module you can call your methods in a simple way:
```
# filename: test_math.py
from my_awesome_software import math

# example test function
def test_add_two_integers():
    assert math.add_two_integers(1, 2) == 3
```
If you're testing a small piece of code, make it a unit test. If you want to test whether two or more software units work well together, create an integration test.
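For example, an integration-style test that exercises two functions together could live under `tests/test_integration`. A hedged sketch, reusing the example `add_two_integers` and `subtract_two_integers` functions (the file name here is just an illustration):

```python
# filename: tests/test_integration/test_math_roundtrip.py
# Sketch of an integration-style test combining two units.
from my_awesome_software.math import add_two_integers, subtract_two_integers


def test_add_then_subtract_roundtrip():
    # Adding and then subtracting the same value should return the original.
    assert subtract_two_integers(add_two_integers(5, 3), 3) == 5
```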
Be sure that you have installed pytest, and run it:

```
pip install pytest
pytest
```
You should also see coverage information.
For a local, editable install, in the project directory, run:

```
pip install -e .
```

For a local, editable install with all the development tools (e.g. testing, formatting), run:

```
pip install -e '.[dev]'
```
You might want to install your package in an ad hoc environment.
To test whether the installation works, try calling your modules with Python from another folder in the same environment:

```
from my_awesome_software.math import add_two_integers
add_two_integers(1, 2)
```
Running `pre-commit install` will set up pre-commit hooks to ensure the code is formatted correctly. Currently, these are:

- ruff does a number of jobs, including linting, auto-formatting code (with `ruff-format`), and sorting import statements.
- mypy, a static type checker.
- check-manifest, to ensure that the right files are included in the pip package.
- codespell, to check for common misspellings.
These will prevent code from being committed if any of these hooks fail. To run them individually:
```
ruff check --fix  # Lint all files in the current directory, and fix any fixable errors.
ruff format       # Format all files in the current directory.
mypy -p my_awesome_software
check-manifest
codespell
```
You can also execute all the hooks using `pre-commit run`. The best time to run this is after you have staged your changes, but before you commit them.
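As an illustration of what mypy looks for, consider a function annotated with type hints like the hedged sketch below; mypy can then flag calls that pass the wrong types before the code is ever run.

```python
# Sketch: type hints that mypy checks statically.
def add_two_integers(a: int, b: int) -> int:
    return a + b


add_two_integers(1, 2)      # OK
add_two_integers("1", "2")  # flagged by mypy: expected "int", got "str"
```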
If you see `mypy` failing with an error like `Library stubs not installed for this-package`, you will need to edit the `.pre-commit-config.yaml` file, adding the additional dependency to `mypy`:

```
- id: mypy
  additional_dependencies:
    - types-setuptools
    - types-this-package
```
We recommend the use of semantic versioning, which uses a `MAJOR.MINOR.PATCH` version number where these mean:

- PATCH = small bugfix
- MINOR = new feature
- MAJOR = breaking change
`setuptools_scm` can be used to automatically version your package. It has been pre-configured in the `pyproject.toml` file. `setuptools_scm` will automatically infer the version using git. To manually set a new semantic version, create a tag and make sure the tag is pushed to GitHub. Make sure you commit any changes you wish to be included in this version. E.g. to bump the version to `1.0.0`:
```
git add .
git commit -m "Add new changes"
git tag -a v1.0.0 -m "Bump to version 1.0.0"
git push --follow-tags
```
N.B. It is also possible to perform this step by using the GitHub web interface or CLI.
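To see which version has been picked up, you can query it from Python. A small sketch, assuming the package is installed and the repository is the current working directory (the distribution name `my-awesome-software` is the example used throughout):

```python
# Sketch: check the package version inferred by setuptools_scm.
from importlib.metadata import version

# Version recorded in the package metadata at install time.
print(version("my-awesome-software"))

# Alternatively, ask setuptools_scm for the version it infers from git.
from setuptools_scm import get_version

print(get_version())
```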
A GitHub actions workflow (`.github/workflows/test_and_deploy.yml`) has been set up to run (on each commit/PR):

- Linting checks (pre-commit).
- Testing (only if linting checks pass).
- Release to PyPI (only if a git tag is present and if tests pass). Requires `TWINE_API_KEY` from PyPI to be set in the repository secrets.
Software documentation is important for effectively communicating how to use the software to others as well as to your future self.
If you want to include documentation in your package, make sure to respond with `1 - Yes` when prompted during the `cookiecutter` setup. This will instantiate a `docs` folder with a skeleton documentation system that you can build upon.
The documentation source files are located in the `docs/source` folder and should be written in either reStructuredText or Markdown. The `index.rst` file corresponds to the main page of the documentation website. Other `.rst` or `.md` files can be included in the main page via the `toctree` directive.
The documentation is built using Sphinx and the PyData Sphinx Theme. The `docs/source/conf.py` file contains the Sphinx configuration.
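For orientation, the settings in `conf.py` look roughly like the hedged sketch below; the exact extensions and options in the generated file may differ.

```python
# Sketch of typical settings in docs/source/conf.py; the generated file may differ.
project = "my-awesome-software"

# Extensions commonly used with this kind of setup: autodoc pulls API
# documentation from docstrings, napoleon parses numpy-style docstrings,
# and myst_parser allows Markdown sources alongside reStructuredText.
extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.napoleon",
    "myst_parser",
]

# Use the PyData Sphinx Theme for the HTML output.
html_theme = "pydata_sphinx_theme"
```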
You can build and view the documentation website locally, on your machine. To do so, run the following commands from the root of your project:
```
# Install the documentation build dependencies
pip install -r docs/requirements.txt

# Build the documentation
sphinx-build docs/source docs/build
```
This should create a `docs/build` folder. You can view the local build by opening `docs/build/index.html` in a browser.
To refresh the documentation after making changes, remove the `docs/build` folder and re-run the build command:

```
rm -rf docs/build
sphinx-build docs/source docs/build
```
We have included an extra GitHub actions workflow in `.github/workflows/docs_build_and_deploy.yml` that will build the documentation and deploy it to GitHub pages.

- The build step is triggered every time a pull request is opened or a push is made to the `main` branch. This way you can make sure that the documentation does not break before merging your changes.
- The deployment is triggered only when a tag is present (see Automated versioning). This ensures that new documentation versions are published in tandem with the release of a new package version on PyPI (see GitHub actions workflow).
- The published docs are by default hosted at `https://<github_username_or_organization>.github.io/<package_name>/`. To enable hosting, you will need to go to the settings of your repository, and under the "Pages" section, select the `gh-pages` branch as the source for your GitHub pages site.
- A popular alternative to GitHub pages for hosting the documentation is Read the Docs. To enable hosting on Read the Docs, you will need to create an account on the website and follow the instructions to link your GitHub repository to your Read the Docs account.
The journey towards good documentation starts with writing docstrings for all functions in your module code. In the example `math.py` and `greetings.py` modules you will find some docstrings that you can use as a template. We have written the example docstrings following the numpy style, but you may also choose another widely used style, such as the Google style.
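As a hedged illustration of the numpy style (the generated examples may differ in detail), a docstring for the `add_two_integers` function used earlier could look like this:

```python
def add_two_integers(a: int, b: int) -> int:
    """Add two integers.

    Parameters
    ----------
    a : int
        The first integer.
    b : int
        The second integer.

    Returns
    -------
    int
        The sum of the two integers.
    """
    return a + b
```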
Once you have written docstrings for all your functions, API documentation can be automatically generated via the Sphinx autodoc extension. We have given examples of how to do this in the `docs/source/api_index.rst` file.