Note
The objectives of this section:
- Get an overview of what you can do with continuous integration.
- Build a first use case with GitHub Actions.
- Automate the publication of new packages on PyPI and conda during a release.
- Automatically publish documentation on Read the Docs or GitHub/GitLab pages.
Package distribution
GitHub
Automation used to be carried out with the continuous-integration tools available around GitHub: Travis, AppVeyor, Circle-CI, ... While these tools are still available, GitHub Actions has greatly changed the way continuous integration is done. It is more flexible, thanks to the different actions that can be found in its marketplace, and it makes it easy to run jobs for all the events managed by GitHub: issue opening, pull requests, releases, ... Released at the end of 2019, it quickly became the most widely used, eclipsing Travis.
If you don't already have an account on GitHub, create one. You'll then need to create a new project, which you can name splinart, for example. Copy the final_step of the packaging TP into your repository.
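If you prefer the command line, here is a minimal sketch of one way to do it. The path to final_step is an assumption based on the layout of the practical sessions, so adapt it to where the material lives on your machine.
# Hypothetical paths: adapt them to your local copy of the course material,
# and replace <username> with your GitHub user name.
git clone https://github.com/<username>/splinart.git
cp -r pratical_session/TPs/2.packaging/final_step/* splinart/
cd splinart
git add .
git commit -m "Import the final step of the packaging TP"
# Assumes the default branch of your new repository is called main
git push -u origin main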
Now that your repository has been created, we'll move on to the following steps (in order):
- Set up a first GitHub Actions workflow for continuous deployment.
- Create the environments testpypi and anaconda on your repository.
- Add a trusted publisher on Test PyPI.
- Create a new release on the repository => deployment of splinart on PyPI and conda.
Workflows are triggered by the files inside the .github/workflows directory. We now need to create the first workflow by adding the file .github/workflows/cd.yml, whose content is:
name: publish
on:
release:
types: [published]
jobs:
build:
    name: Make SDist and wheel
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install build
run: pip install build
- name: Building
run: python -m build
      - uses: actions/upload-artifact@v4
with:
path: dist/*
upload_on_pypi:
needs: [build]
runs-on: ubuntu-latest
environment: testpypi
permissions:
id-token: write
steps:
      - uses: actions/download-artifact@v4
with:
name: artifact
path: dist
- uses: pypa/gh-action-pypi-publish@release/v1
with:
repository-url: https://test.pypi.org/legacy/
# password: ${{ secrets.TEST_PYPI_TOKEN }} # If you use a token
upload_on_conda:
needs: [build]
runs-on: ubuntu-latest
environment: anaconda
steps:
      - uses: actions/checkout@v4
- uses: mamba-org/setup-micromamba@v1
with:
environment-name: build-env
create-args: >-
python=3.10
conda-build
anaconda-client
- name: Build the recipe
shell: bash -l {0}
run: conda build recipes
- name: upload on conda
shell: bash -l {0}
run: anaconda -t ${{ secrets.ANACONDA_TOKEN }} upload --force /home/runner/micromamba/envs/build-env/conda-bld/*/splinart-*.tar.bz2
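Before testing the workflow on GitHub, you can sanity-check the packaging step locally. This mirrors the build job above; twine check is the same verification used in the GitLab pipeline later in this section.
# Build the sdist and the wheel locally, then check the metadata
# that will be uploaded to (Test) PyPI
pip install build twine
python -m build
twine check dist/*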
Note
We can see that this workflow is only triggered at the time of a release:
on:
  release:
    types: [published]
We start by creating the archive and wheels with the job build. If all goes well, we deploy to PyPI and to conda, which is why these two jobs have needs: [build]. Deployment on Test PyPI is tokenless, as we've added our package's GitHub repository as a trusted publisher.
Deployment on conda is done using a token that must be added to the secrets of your repository. We'll start by creating a token on anaconda: go to your anaconda account, in the settings->access section, and create a token with the following rights:
- api:read (Allow read access to the API site)
- api:write (Allow write access to the API site)
You can now add it to GitHub's secrets by going, at repository level, to settings/secrets and variables/actions and adding your token in the repository secrets section. The name of the secret must match the one used in the workflow, here: ANACONDA_TOKEN.
For the PyPI part, you can do the same (add a token) by going to the API tokens section, or declare your GitHub repository as a trusted publisher.
Once you've set all this up and the workflow is in the main branch of the repository, you can test it by creating a new release.
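You can create the release from the GitHub web interface, or from the command line if you have the gh CLI installed; the tag name below is just an example.
# Hypothetical tag name: use the version you actually want to publish
gh release create 0.0.1 --title "0.0.1" --generate-notes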
GitLab
On GitLab, the CI/CD is specified in the .gitlab-ci.yml file. Here is an example of packaging and deploying on Test PyPI and anaconda. The two tokens used below (TEST_PYPI_TOKEN and ANACONDA_TOKEN) must be defined as CI/CD variables in the project settings (Settings -> CI/CD -> Variables).
workflow:
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
- if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
when: never
- if: $CI_COMMIT_BRANCH
stages:
- build
- deploy
build_package:
stage: build
image: python:3.10
before_script:
- pip install build twine
script:
- python -m build
after_script:
- twine check dist/*
artifacts:
paths:
- dist
publish_testPyPI:
stage: deploy
image: python:3.10
environment: testpypi
before_script:
- pip install twine
script:
- twine upload -u __token__ -p $TEST_PYPI_TOKEN -r testpypi dist/*
publish_anaconda:
stage: deploy
image: mambaorg/micromamba:latest
environment: anaconda
  before_script:
    # the mambaorg/micromamba image ships micromamba, not mamba
    - micromamba create -y -n build-env -c conda-forge python=3.10 conda-build anaconda-client
  script:
    # micromamba run avoids having to activate the environment in the CI shell
    - micromamba run -n build-env conda build recipes
    # in this image the environments live under $MAMBA_ROOT_PREFIX
    - micromamba run -n build-env anaconda -t $ANACONDA_TOKEN upload --force $MAMBA_ROOT_PREFIX/envs/build-env/conda-bld/*/splinart-*.tar.bz2
Automatic versioning
Every time you want to set up a new release, you need to:
- Change the version in splinart/version.py.
- Change the version in recipes/meta.yaml.
- Push the changes.
- Create a new release.
Humans are not good at repetitive tasks and tend to forget steps, but computers are, so let's delegate this to them. Let's make some changes to the recipe.
{% set pyproject = load_file_data('pyproject.toml') %}
{% set name = pyproject.get('name') %}
{% set version = pyproject.get('version') %}
package:
name: {{ name }}
version: {{ version }}
...
Now the recipe is linked to the pyproject.toml file: if we change the name or the version of the project in pyproject.toml, it will be updated in the recipe too.
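To check that the Jinja variables resolve as expected, you can render the recipe locally; this assumes conda-build is installed in your current environment.
# Print the recipe with the Jinja variables resolved from pyproject.toml
conda render recipes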
Now let's use setuptools_scm (in the pyproject.toml) to derive the version of the project from git tags, so we no longer have to set it manually. We need to declare the version as dynamic and add setuptools_scm as a build requirement.
[build-system]
requires = ["setuptools>=64", "setuptools_scm>=8"]
build-backend = "setuptools.build_meta"
[project]
name = "splinart"
dynamic = ["version"]
...
[tool.setuptools_scm]
We also need to update the recipe's build requirement.
{% set pyproject = load_file_data('pyproject.toml') %}
{% set name = pyproject.get('name') %}
{% set version = pyproject.get('version') %}
package:
name: {{ name }}
version: {{ version }}
...
requirements:
build:
- setuptools_scm
...
Now, every time we create a release, the tag we choose is the version used for the packaging; we don't have to change it manually anymore.
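A quick local check, assuming setuptools_scm is installed: tag the repository and ask setuptools_scm which version it derives.
# The tag name becomes the package version
git tag 0.2.0
# Print the version setuptools_scm derives from the repository state
python -m setuptools_scm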
Documentation deployment
ReadTheDocs
We're now going to look at how to generate the documentation automatically on Read the Docs. To do this, we'll add a requirements.txt file to the docs directory, listing everything we need for sphinx and its dependencies, and a .readthedocs.yml file in the project root indicating how to build our project. Here's what these two files look like:
sphinx
nbsphinx
pydata-sphinx-theme
numpydoc
nbsphinx-link
myst-parser
version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
# You can also specify other tool versions:
# nodejs: "16"
# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py
# configuration: docs/source/conf.py
# Dependencies required to build your docs
python:
install:
- requirements: docs/requirements.txt
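Before pushing, you can reproduce the build locally. The sketch below assumes the Sphinx configuration sits directly in docs/, as in the .readthedocs.yml above; adapt the paths if yours lives in docs/source.
# Install the project and the documentation dependencies
pip install .
pip install -r docs/requirements.txt
# Build the HTML documentation roughly the way Read the Docs will
sphinx-build docs _build/html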
You now need to create an account at https://readthedocs.org/ and connect your GitHub project to it. This way, every time you update the repository, the documentation will be regenerated. More information is available at https://docs.readthedocs.io/en/stable/integrations.html.
GitHub pages
To use a custom build process or a static site generator other than Jekyll, you have to write a custom workflow to build and publish your site.
To activate it, go to Settings -> Code and automation -> Pages and, inside the Build and deployment section, choose GitHub Actions as the source.
Here is an example workflow that deploys documentation generated with sphinx to GitHub Pages.
# Simple workflow for deploying static content to GitHub Pages
name: Deploy static content to Pages
on:
# Runs on pushes targeting the default branch
push:
branches:
- main
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
contents: read
pages: write
id-token: write
# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
group: "pages"
cancel-in-progress: false
jobs:
# Single deploy job since we're just deploying
deploy:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with: # this is needed if you need to fetch tags (setuptools_scm)
fetch-depth: 0
fetch-tags: true
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: '3.10'
- name: Install your project
run: pip install .
- name: Setup documentation dependencies
run: pip install -r docs/requirements.txt
      # The last two steps can be merged into one if you have an optional
      # dependency for the doc and run pip install .[doc]
- name: Build documentation
run: sphinx-build docs/source html
- name: Clean up .doctrees
run: rm -rf `find html -name .doctrees -type d`
# Github pages
- name: Setup Pages
uses: actions/configure-pages@v3
- name: Upload artifact
        uses: actions/upload-pages-artifact@v3
with:
path: 'html'
- name: Deploy to GitHub Pages
id: deployment
        uses: actions/deploy-pages@v4
GitLab pages
To deploy on GitLab, you need to add a job called pages and build your documentation inside the public folder, because GitLab Pages only serves files from a directory called public.
workflow:
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
- if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
when: never
- if: $CI_COMMIT_BRANCH
...
pages:
stage: deploy
environment: production
image: python:3.10
before_script:
- pip install -r docs/requirements.txt # doc dependencies
- pip install . # install your project
script:
- sphinx-build docs/source public
artifacts:
paths:
- public
rules:
- if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
Exercises
The pratical_session/TPs/3.cd/step0 directory takes up the last step of the previous TP. Now let's try to integrate the deployment with CD.
Package distribution
- Choose if you want to use GitHub or GitLab.
- Add the deployment environments on GitHub/GitLab.
- Generate an API token, or add GitHub as a trusted publisher on Test PyPI.
- Add the workflow/pipeline files and push them.
- Make a release.
Documentation distribution
- Add a requirements.txt file to the documentation folder.
- Choose if you want to use Read the Docs or pages.
- Add the workflow (GitHub Pages), the job (GitLab Pages), or the .readthedocs.yml file (Read the Docs).
- Push your changes.
- Connect your GitHub account to Read the Docs if you deploy on it.