GitHub’s own CI service, GitHub Actions, has been out of closed beta for a while. It offers generous free quotas and seamless integration with the rest of the site. Let’s have a look at how to use it for an open source Python package.

This is the follow-up to my article on Python in Azure Pipelines. While Azure Pipelines is more advanced, my current recommendation for most people is to switch to GitHub Actions for its simplicity and better integration.

Assumptions

This is not meant to be exhaustive documentation on using GitHub Actions for Python packages. It focuses on a certain – and quite common – setup and is the guide I wish I’d had when I started moving my own projects. It works best if the following is true:

  • You’re following along using the repository that will run the CI – not a private fork.
  • You’re using tox to manage your test environments and would like to re-use them in CI¹.
  • You only need the Python versions that are provided by GitHub Actions².

If some of those assumptions aren’t true, you’ll still find value in this article, but you’ll have to improvise or leave things out.

Running Your Tests

To get started, click the “Actions” tab in your GitHub project – it’s right next to “Pull requests”. The example workflows on offer all assume a different development model than I like, so we’ll start with “Skip this: Set up a workflow yourself”.

You get thrown into a text field with the workflow YAML. Delete their template and paste the following instead:

---
name: CI

on:
  push:
    branches: ["master"]
  pull_request:
    branches: ["master"]

jobs:
  tests:
    name: "Python ${{ matrix.python-version }}"
    runs-on: "ubuntu-latest"
    env:
      USING_COVERAGE: '3.6,3.8'

    strategy:
      matrix:
        python-version: ["3.6", "3.7", "3.8", "pypy3"]

    steps:
      - uses: "actions/checkout@v2"
      - uses: "actions/setup-python@v1"
        with:
          python-version: "${{ matrix.python-version }}"
      - name: "Install dependencies"
        run: |
          set -xe
          python -VV
          python -m site
          python -m pip install --upgrade pip setuptools wheel
          python -m pip install --upgrade coverage[toml] virtualenv tox tox-gh-actions

      - name: "Run tox targets for ${{ matrix.python-version }}"
        run: "python -m tox"

Don’t forget to adapt the list of python-version to your own requirements, but do not add tox environments anywhere just yet.

I will explain the environment variable USING_COVERAGE in the section on measuring and reporting coverage, so you can ignore it for now (or even delete it, if you don’t care about coverage at all).


Click the green “Start commit” button in the top right and make sure you select the “Create a new branch for this commit and start a pull request.” radio button.

Give the branch a memorable name (e.g. github-actions) and then click “Create pull request”. We’ll need to make more changes, and it’s nice to have a branch and a pull request that will run the tests against them. We’ll make all our remaining changes in GitHub’s web interface too!

For that, switch to the “Code” tab on the far left, click the “Branch: master” button, and select your newly created branch.

You may have already noticed that we installed the tox plugin tox-gh-actions. It’s a very nice helper that lets you run tox in GitHub Actions without specifying which environments to run – it picks them based on the Python version that is running tox itself. To make it work, you have to configure it, so open your tox.ini and add a block like this:

[gh-actions]
python =
    3.6: py36
    3.7: py37, docs
    3.8: py38, lint, manifest
    pypy3: pypy3

It maps GitHub Actions Python version numbers to tox environments. A few remarks:

  1. You don’t have to specify all permutations of an environment. tox-gh-actions will run py38 as well as py38-foo and py38-bar, if they exist. This distinguishes it from tox’s tox -e py feature, which would only run py38.
  2. Make sure the Python versions in [gh-actions] match those in your tox environments (and, if you run pre-commit from tox, those in .pre-commit-config.yaml). Otherwise, they’ll fail due to missing interpreters. There’s an envlist sketch right after this list for reference.
  3. Since all tox environments for a single Python version run sequentially in one container, you lose some of the benefits of concurrency. In this case, the Python 3.8 job takes the longest by far because it also runs all the linting. I personally find the simplicity worth this tradeoff, but everyone has to decide for themselves.
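To illustrate the second remark, here’s a minimal sketch of an envlist that would match the [gh-actions] block above – the environment names are only examples and have to match whatever your tox.ini actually defines:

[tox]
envlist = lint, py36, py37, py38, pypy3, docs, manifest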

At this point, your GitHub Actions CI should pass. You should double check that all Python versions run all the tox environments that you expect them to run.

Cleanup

Now that you’re running your tests in GitHub Actions, it’s time to clean up.

  • Remove your old CI’s configuration (e.g. .travis.yml or azure-pipelines.yml) from your repository.
    • Possibly delete the projects from their sites.
  • Replace your CI badge in the README. For reST, it should look something like this (a Markdown version follows after this list):

    .. image:: https://github.com/YOU/YOUR-PROJECT/workflows/CI/badge.svg?branch=master
       :target: https://github.com/YOU/YOUR-PROJECT/actions?workflow=CI
       :alt: CI Status
    
  • If you’re using branch protection, remove the old CI from Settings → Branches → Branch protection rules → master → Edit and mark the new GitHub Actions checks as required.

  • There may be references to your old CI system throughout your documentation. It’s best to grep your repository for its name and hostname.
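If your README is written in Markdown instead of reST, the equivalent badge from the list above looks roughly like this:

[![CI Status](https://github.com/YOU/YOUR-PROJECT/workflows/CI/badge.svg?branch=master)](https://github.com/YOU/YOUR-PROJECT/actions?workflow=CI)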

Summary

Congratulations, your project is running in GitHub Actions! With very little work, we’ve got a well-integrated, decently fast CI solution!

If you’d like to see a complete Azure Pipelines to GitHub Actions transition, check out this structlog commit.

You will notice that there’s more going on in it, and if that piques your interest, feel free to dive into the following bonus topics!

Coverage

Technically, all you need to upload your coverage data to Codecov is adding the following two steps:

      - name: "Convert coverage"
        run: "python -m coverage xml"
      - name: "Upload coverage to Codecov"
        uses: "codecov/codecov-action@v1"
        with:
          fail_ci_if_error: true

The first step converts coverage’s data to XML³; the second runs the official Codecov action, which uploads your coverage to Codecov without you having to store the project’s secret Codecov token in GitHub.

And maybe that’s good enough for you! In which case you can remove the whole env: block that sets USING_COVERAGE and skip the rest of this section.


My life, however, is never this simple, so my coverage setup is a bit more involved:

Firstly, I measure coverage in parallel mode, which means that I have to run coverage combine before I can convert the data to XML.
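For context, parallel mode is typically switched on in coverage’s own configuration – for instance in pyproject.toml, which works here because we installed coverage[toml] earlier. A minimal sketch, with the package name as a placeholder:

[tool.coverage.run]
parallel = true
branch = true
source = ["your_package"]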

More importantly: I don’t run all my environments under coverage, because it has a noticeable performance impact on larger test suites – especially on PyPy. So I only pick those environments that I need to cover all of my code – here 3.6 and 3.8, matching the USING_COVERAGE variable from the beginning.

But since coverage combine and coverage xml would fail if run in a directory without coverage data, I have to make their runs conditional. For that, I create the environment variable USING_COVERAGE that you have seen before and list all Python versions that do run under coverage (i.e. using coverage run or pytest-cov), separated by commas⁴.

Now I use the contains() function to check whether the currently active Python version (matrix.python-version) is part of that string. If it is, the job combines the coverage data, converts it into an XML file, and uploads it to Codecov:

      - name: "Combine coverage"
        run: |
          set -xe
          python -m coverage combine
          python -m coverage xml
        if: "contains(env.USING_COVERAGE, matrix.python-version)"
      - name: "Upload coverage to Codecov"
        if: "contains(env.USING_COVERAGE, matrix.python-version)"
        uses: "codecov/codecov-action@v1"
        with:
          fail_ci_if_error: true

I have to protect the Codecov action with another if condition so I can use fail_ci_if_error: true. This is essential because Codecov has grown unreliable over the years⁵, and if an upload fails, I want that job to fail too – instead of Codecov failing a pull request due to ostensibly missing coverage.


Unfortunately, this doesn’t cover one failure scenario of Codecov’s flakiness: sometimes the upload succeeds, but Codecov fails to update the GitHub check. That means all your builds pass and Codecov’s website reports 100%, but the Codecov check on the pull request is still failing with some amount of coverage ostensibly missing.

And sadly, you cannot restart a GitHub Actions workflow that succeeded, so there is no easy way out of this situation. The simplest workaround is to push an empty commit to trigger a new build.
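For completeness, such an empty commit can be created from the command line like this:

git commit --allow-empty -m "Trigger new CI build"
git push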

But in the long term, we should probably look for ways to stop relying on external services for coverage reporting. The only serious competitor, Coveralls, doesn’t support Python coverage data in its official action, and if you use the PyPI package by hand, GitHub Actions won’t inject the necessary secret token into builds that run from a fork (i.e. a pull request by a contributor) – rendering it pointless.

Ensure You Can Build Your Package

I personally prefer to release my packages using local automation, but it’s still nice to continuously make sure that your package builds and that your PyPI long description renders properly.

Since I use setuptools for packaging and isolated builds, my job looks like this:

  package:
    name: "Build & verify package"
    runs-on: "ubuntu-latest"

    steps:
      - uses: "actions/checkout@v2"
      - uses: "actions/setup-python@v1"
        with:
          python-version: "3.8"

      - name: "Install pep517 and twine"
        run: "python -m pip install pep517 twine"
      - name: "Build package"
        run: "python -m pep517.build --source --binary ."
      - name: "List result"
        run: "ls -l dist"
      - name: "Check long_description"
        run: "python -m twine check dist/*"

This doesn’t actually do anything with the package except check it using twine, but that’s something you can change easily.
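For example – just a sketch, and you’d replace structlog with your own import name – you could append steps that install the freshly built wheel and import it as a smoke test:

      - name: "Install from wheel"
        run: "python -m pip install dist/*.whl"
      - name: "Import package"
        run: "python -c 'import structlog; print(structlog.__version__)'"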

Ensure Your Dev Environment Works Everywhere

I do all my development on macOS, and my packages are usually platform-agnostic – however, I have managed to break the development environment on Windows before. To prevent that from happening again, I have the following job that checks whether the package can be installed in development mode and subsequently imported on all three major platforms:

  install-dev:
    strategy:
      matrix:
        os: ["ubuntu-latest", "windows-latest", "macos-latest"]

    name: "Verify dev env"
    runs-on: "${{ matrix.os }}"

    steps:
      - uses: "actions/checkout@v2"
      - uses: "actions/setup-python@v1"
        with:
          python-version: "3.8"
      - name: "Install in dev mode"
        run: "python -m pip install -e .[dev]"
      - name: "Import package"
        run: "python -c 'import structlog; print(structlog.__version__)'"

It assumes that your package defines an extra dependency called dev – my projects use it to install everything you need to work on them locally, including building their documentation. If you’d like to know more about how I run my FOSS projects, I gave a talk on it in 2019.

Please note that the last line is the only line containing the project name – in this case structlog. The rest of your workflow file is easily copy-and-paste-able.
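In case your package doesn’t define such a dev extra yet, here’s a minimal setup.cfg-based sketch – the concrete packages are only examples and depend on your project:

[options.extras_require]
dev =
    pre-commit
    pytest
    sphinx
    tox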

Further Reading

Footnotes


  1. Here’s a rather simple example of how I use tox. ↩︎
  2. You can of course use the deadsnakes PPA on their Linux runners since they are based on Ubuntu. ↩︎
  3. You may have been using the pip-installable Codecov client that can handle a .coverage file directly. That’s not true for the Codecov action, which requires an XML coverage file. ↩︎
  4. Please note that USING_COVERAGE is a flat string, and my use of a comma-separated list is just a personal convention. ↩︎
  5. My by far biggest GitHub Actions feature request at the moment is the ability to rerun only certain jobs – ideally certain matrix combinations. The chance that another Codecov upload will fail on the second build is sadly not zero. ↩︎