Since the acquisition of Travis CI, the future of their free offering is unclear. Azure Pipelines has a generous free tier, but the examples I found are discouragingly complex and take advantage of features like templating that most projects don’t need. To close that gap, this article shows you how to move a Python project with simple CI needs from Travis CI to Azure Pipelines.

Dedicated to #travisAlums – you’ve changed the world!

Update from 2020-03-11: While this article remains correct, I personally think that GitHub Actions are the better choice for Python packages nowadays so I’ve blogged about it.

To keep the post short, I’m going to make a few assumptions:

  1. You’re hosting your code on GitHub.
  2. You only need Linux builds.
  3. You’re using tox to ensure parity between CI and local development.

Failing one or more of those assumptions doesn’t make this post worthless to you, but you will have to do some extra legwork yourself. This post is mostly about getting your foot in the door without dumping a load of YAML on you that you can’t understand.

Why Azure Pipelines?

The main reason to go for Azure Pipelines may prove bad down the road, but it is what it is: unlike other CI companies, Microsoft is not interested in selling you continuous integration as a product. They want to pull you into their ecosystem (see also VS Code), therefore their free offering is incredible: you get 10 parallel builds on Linux, Windows, and macOS. And individual builds are also very fast1.

After using Azure Pipelines for a while, the downsides I could find so far are:

  1. Microsoft is a very enterprise-y company, and never was that better illustrated than around the time this article went live: they broke most Python installations for a whopping 24 days.

    The most frustrating part was that it took them four days to even forward the problem to the responsible product group, while feeding have-you-tried-switching-it-on-and-off-again-level advice to the people reporting it.

    2019-07-02 they published a post mortem – more than a month after the problems started.

  2. It’s really complex and things can be difficult to find.

  3. Build triggers can be slightly flaky sometimes.

  4. It’s impossible to rebuild a job. So when some flakiness fails a job of yours (could be just codecov), you have to start a complete rebuild.

  5. They don’t support unreleased in-development versions of Python interpreters out of the box.

  6. There is no trivial way to allow certain builds to fail, like the allow_failures: option in Travis CI’s matrix.

Point 5 might get remedied soon – I have seen some preliminary work being merged. Until then, you can achieve the same using the deadsnakes PPA, as I’ll show at the end of this article. Either way, it remains highly problematic without the last point – especially as early in a release cycle as, e.g., Python 3.8 is as of June 2019, when everything seems to be broken.

Setting Up Your Account

Before we can get started you’ll need to do three things at dev.azure.com:

  1. Create an account.
  2. Create an organization. I just went for “the-hynek” because my usual “hynek” was taken. If your project lives in a GitHub organization, it’s a good idea to mirror that here too. Therefore my attrs is an organization as well. However, there is absolutely no functional connection between GitHub organizations and Azure organizations.
  3. Create a project. Make sure you set “Visibility” to “Public” if it’s an open source project.

Those should be self-explanatory. Ideally go for “Start free with GitHub” to link your account right away.

Creating a Pipeline

To start building, we need to create a pipeline. For that, go to “Pipelines” and click “New pipeline” in the big empty field on the right. Choose “GitHub”, and you should be presented with a list of your GitHub repositories.

Pick the one you want to build/test in this pipeline and you will be redirected to GitHub, where you have to confirm that you want to give Azure Pipelines access to your repository.

Now you can pick a template for your pipeline. It doesn’t matter which one you pick – you’ll be thrown into an editor right away where you can edit your azure-pipelines.yml file, the analog of .travis.yml. You may want to account for it in your packaging configuration (e.g. your MANIFEST.in) before continuing, to save yourself from packaging problems later.

Select everything in the editor and hit delete – we’re starting over.

There is a bunch of boilerplate you need to add but ultimately the main task and biggest busywork when moving from Travis CI is to transform the test matrix entries for your tox environments into Azure Pipelines’ format. So instead of

- python: "3.7"
  env: TOXENV=py37

you want something like

py37:
  python.version: '3.7'
  tox.env: py37

Please note that unlike Travis CI’s, Azure Pipelines’ matrix is a dictionary, not a list. So the build names must be unique, and the easiest way to achieve that is to re-use your tox environment names, which should be unique too.
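To make that difference concrete, here is a plain-Python sketch of the two shapes (the entries are illustrative):

```python
# Travis CI's matrix is a list, so repeated Python versions are unremarkable:
travis_matrix = [
    {"python": "3.7", "env": "TOXENV=py37"},
    {"python": "3.7", "env": "TOXENV=docs"},
]

# Azure Pipelines' matrix is a mapping, so every build needs a unique
# name -- re-using the (unique) tox environment name works nicely:
azure_matrix = {
    "py37": {"python.version": "3.7", "tox.env": "py37"},
    "docs": {"python.version": "3.7", "tox.env": "docs"},
}

# Every Travis entry maps to exactly one uniquely named Azure entry:
assert len(azure_matrix) == len(travis_matrix)
print(sorted(azure_matrix))  # ['docs', 'py37']
```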

And since both formats are YAML, the transformation can be easily automated. If your matrix follows the very common format from above, you can use the following script to transform your matrix from one format to the other.

Please note that it depends on aspy.yaml to retain the order of your environments on all Python versions:

from collections import OrderedDict
from textwrap import indent

import aspy.yaml

with open(".travis.yml") as f:
    travis = aspy.yaml.ordered_load(f)

matrix = OrderedDict()
for build in travis["matrix"]["include"]:
    tox_env = build.get("env", "nope")
    if not tox_env.startswith("TOXENV="):
        continue
    tox_env = tox_env.split("=", 1)[1]
    python = build["python"]
    # Not supported yet.
    if python.endswith("-dev"):
        continue
    if python == "pypy":
        python = "pypy2"

    matrix[tox_env] = OrderedDict(
        (("python.version", python), ("tox.env", tox_env))
    )

print(indent(aspy.yaml.ordered_dump(matrix), " " * 8))

Just run it in the directory with the .travis.yml that you’d like to convert. It prints a correctly indented block that you can paste straight into your azure-pipelines.yml.

If your Travis CI matrix is structured differently, you still may be able to adapt the script to your needs. 🤞

A complete and fully functional azure-pipelines.yml could look like this:

trigger:
  - master

jobs:
  - job: 'Test'
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
      matrix:
        # <-- SNIP BEGIN -->
        py27:
          python.version: '2.7'
          tox.env: py27
        py37:
          python.version: '3.7'
          tox.env: py37
        docs:
          python.version: '3.7'
          tox.env: docs
        # <-- SNIP END -->

    steps:
      - task: UsePythonVersion@0
        displayName: Get Python for Python tools.
        inputs:
          versionSpec: '3.7'
          addToPath: false
        name: pyTools

      - script: $(pyTools.pythonLocation)/bin/pip install --upgrade tox
        displayName: Install Python-based tools.

      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
          architecture: 'x64'
        displayName: Use cached Python $(python.version) for tests.

      - script: $(pyTools.pythonLocation)/bin/tox -e $(tox.env)
        displayName: run tox -e $(tox.env)

Just replace the example content between # <-- SNIP BEGIN --> and # <-- SNIP END --> with the result of the script.

By default, Azure Pipelines triggers builds for every commit on every branch, and for every pull request. To avoid having two builds per pull request, we use the trigger section to limit the branch trigger to commits on the master branch.

We only define one job that runs on the latest Ubuntu Linux image with the configurations that we’ll define in the strategy matrix.

And the steps for each entry in the strategy matrix are:

  1. Install an “isolated” Python 3.7 that is not added to the search PATH and that we’ll use for Python-based tools. This ensures that our build doesn’t start failing when running tests for Python versions that tox itself no longer supports (e.g. Python 2.7 in 2020).
  2. Install the latest version of tox using said Python. Since it’s not on the search PATH, we have to call it by its explicit path via the pyTools variable.
  3. Install the Python version that is specified by the python.version variable in the matrix.
  4. Run tox – again from its absolute path – and pass the value stored in tox.env as the environment to use.

Press “Save and Run”, but before clicking it again in the pop-up window, choose “Create a new branch for this commit and start a pull request.” so you don’t break your master branch. The chances that everything works out on the first attempt are rather slim. #murphy

Clean Up GitHub

Now it’s time to clean up your GitHub project, which I like to do as part of the pull request – all from GitHub’s web UI: simply choose azure-pipelines in the “Branch” chooser and go wild. The web-based editor is good enough for the tasks we need to achieve.

First, thank your old .travis.yml and delete it. You may also want to replace the CI badge in your README. You can get a badge from Builds → three vertical dots → Status Badge, but alas, it currently doesn’t support reST badges. So you’ll have to build it yourself: copy the “Image URL” and fill out the following snippet:

.. image:: <the Image URL you just copied>
   :alt: CI Status

Now go to the “Settings” tab of your GitHub project (top right), click “Webhooks”, and click “Delete” on the Travis CI entry.

Still in “Settings”, go to “Branches”. If you don’t have anything there yet, click “Add rule” and enter master in “Branch name pattern”. Then click “Require status checks to pass before merging” and there should be a check named like “your-name.your-project” – e.g. hynek.structlog. That’s Azure Pipelines, so activate it. If you already had rules present, make sure that you also unselect continuous-integration/travis-ci. Finally click “Create” and then “Save changes”.

If the check appears in the pull request and is passing, you can merge the pull request now!

Currently (2019-06-03), after merging your azure-pipelines branch, you have to unstick master builds by queueing up one build manually using Builds → Queue.


Once your builds are passing, you can do some smaller things to make the experience nicer.

Clean Up the UI

You most likely don’t use all the features in the sidebar, so why not disable them? For that, click “Project settings” in the lower left corner, scroll down a little to “Azure DevOps services”, and turn off everything you don’t need (most likely everything except “Pipelines”).


Add Coverage Reporting

I’m only going to discuss how to use Codecov, which you may have already used on Travis CI. If you’re interested in using Azure Pipelines’s internal coverage support, I recommend checking out azure-pipelines-template.

Unlike Travis CI, Circle CI, and AppVeyor, Azure Pipelines has no first-class integration with Codecov yet (but I’ve been told it’s in the works). Therefore you have to manually use Codecov’s token as a secret variable2.

First, find the token in your project’s settings on the Codecov homepage. Then, back in Azure Pipelines, go into Pipelines → Builds and click “Edit” in the top right corner. Now click the three inconspicuous vertical dots in the top right corner and select “Variables”.

Click “Add” and add a variable called codecov.token with the token you’ve retrieved from Codecov earlier (it looks like a UUID). Make sure to click the little lock symbol behind the token to make it a secret variable – it’s easy to miss.

At this point you would be able to access the secret variable from CI. But to enable your contributors to do the same in pull requests, you have to find the best hidden option in the history of computing: at the top left, you should see something resembling tabs with “YAML”, “Variables” (should be underlined, that’s where you are right now), and “Triggers”. Click “Triggers”.

On the left side, you’ll see a list with four sub-captions. The second should be “Pull request validation” and you should see your repo there. Click on your repo. On the right side, you should now see a sub-section called “Forks” with a checkbox called “Make secrets available to builds of forks” – check it3.

Now press “Save & queue”, then “Save & queue” in the popup menu again. Before you press “Save & queue” a third time, make sure to change the branch name from master to the name you chose earlier – the default being azure-pipelines.

Your build now has a secret variable with the token available, but we still have to use it. For that, we add a fifth step that is run for each entry in the matrix and that reports the coverage to Codecov if – and only if – the build succeeded4:

      - script: |
          if [ ! -f .coverage.* ]; then
            echo "No coverage data found."
            exit 0
          fi

          case "$(python.version)" in
          "pypy2") PY=pypy ;;
          "pypy3") PY=pypy3 ;;
          *) PY=python$(python.version) ;;
          esac

          # Python 3.8 needs an up-to-date pip.
          if [ "$(python.version)" = "3.8" ]; then
            curl -o get-pip.py https://bootstrap.pypa.io/get-pip.py
            $PY get-pip.py --user
          fi

          $PY -m pip install --user coverage codecov
          $PY -m coverage combine
          $PY -m codecov
        env:
          CODECOV_TOKEN: $(codecov.token)
        displayName: Report Coverage
        condition: succeeded()

Since it’s problematic to combine/report coverage with a coverage installation that runs under a different Python version than the one that ran the tests, we go to great lengths to ensure version parity.

Most of the complexity is caused by wanting to support deadsnakes installations, where you can’t rely on python being the Python you used for your tests. Since the pip version on the build servers is too old for Python 3.8, we also have to install a fresh pip by hand – a step that won’t be necessary in the future.
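The version-to-interpreter mapping that the case statement performs can be expressed in Python like this – a simplified sketch of the same logic, not part of the pipeline itself:

```python
def interpreter_for(python_version):
    """Map the matrix's python.version value to the executable name that
    the coverage step invokes -- mirroring the shell case statement."""
    special = {"pypy2": "pypy", "pypy3": "pypy3"}
    return special.get(python_version, "python" + python_version)

print(interpreter_for("3.7"))    # python3.7
print(interpreter_for("pypy2"))  # pypy
```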

We map the secret variable to an environment variable as per Microsoft’s stern recommendation.

Run a build and double-check the output of the “Report Coverage” step of a build that generates coverage data! The step will not fail the build if reporting goes wrong, and it can be quite finicky. Also double-check the project on Codecov to ensure the data is reported correctly.

Add pytest Integration

Azure Pipelines supports the collection of a bunch of information under the “Tests” tab of the build detail view.

If you use pytest, all you have to do to fill it with data is install the package pytest-azurepipelines. To avoid an unnecessary installation when the tests aren’t running in Azure Pipelines, I use an extra dependency plus environment variable substitution in my tox.inis.

I already have a tests extra dependency that installs all packages that are needed for running my tests. You can use it by running pip install -e .[tests] or by passing it as extras in your tox.ini.

Check out my service-identity project to see how this can look in practice: first I build a dictionary that maps extra names to lists of dependencies. Then I pass said dictionary to the setuptools.setup() call.
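The shape of that dictionary looks roughly like this – a sketch with made-up dependency lists, not service-identity’s actual ones:

```python
# Extras mapping as you'd pass to setuptools.setup(extras_require=...).
# The dependency lists are illustrative.
EXTRAS_REQUIRE = {
    "docs": ["sphinx"],
    "tests": ["coverage", "pytest"],
}
# A convenience extra for CI: everything from "tests" plus the plugin
# that feeds Azure Pipelines' "Tests" tab.
EXTRAS_REQUIRE["azure-pipelines"] = EXTRAS_REQUIRE["tests"] + [
    "pytest-azurepipelines"
]

print(sorted(EXTRAS_REQUIRE))  # ['azure-pipelines', 'docs', 'tests']
```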

Next you have to make sure that tox uses that extra, but only if running in Azure Pipelines. What I use here is a feature of tox that allows you to use the contents of an environment variable, falling back to a default value if it’s not set:

extras = {env:TOX_AP_TEST_EXTRAS:tests}

Therefore: if TOX_AP_TEST_EXTRAS is set, the contents of the variable are used as the extras marker. If not, tox installs the tests extra.
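The lookup tox performs here can be sketched in Python – a simplified approximation of the {env:VAR:default} form only; real tox supports more substitution variants:

```python
import re


def substitute_env(value, environ):
    """Resolve {env:VAR:default} placeholders against a mapping."""
    def repl(match):
        var, default = match.group(1), match.group(2)
        return environ.get(var, default)

    return re.sub(r"\{env:([^:}]+):([^}]*)\}", repl, value)


line = "{env:TOX_AP_TEST_EXTRAS:tests}"
# Variable unset -> fall back to the default "tests" extra:
print(substitute_env(line, {}))  # tests
# Set by the CI step -> use the "azure-pipelines" extra:
print(substitute_env(line, {"TOX_AP_TEST_EXTRAS": "azure-pipelines"}))  # azure-pipelines
```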

Finally we have to actually set that variable and we can do that by changing the “Run tox” step in your azure-pipelines.yml:

      - script: $(pyTools.pythonLocation)/bin/tox -e $(tox.env)
        env:
          TOX_AP_TEST_EXTRAS: azure-pipelines
        displayName: run tox -e $(tox.env)

How to Get Almost Any Python Version: deadsnakes

Since you’re running on a fully fledged Ubuntu virtual machine, you can use deadsnakes to install – for example – a development version of Python 3.8 by hand instead of using Azure Pipelines’s cached builds:

      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
          architecture: 'x64'
        condition: not(in(variables['python.version'], '3.8'))
        displayName: Use cached Python $(python.version) for tests.

      - script: |
          sudo add-apt-repository ppa:deadsnakes
          sudo apt-get update
          sudo apt-get install -y --no-install-recommends python$(python.version)-dev python$(python.version)-distutils          
        condition: in(variables['python.version'], '3.8')
        displayName: Install Python $(python.version) from the deadsnakes PPA for tests.

We protect the UsePythonVersion step with the condition that python.version must not be 3.8, and we add another step that installs it only if python.version is 3.8. Since we use in() instead of eq()/ne(), you can add more versions here – for example, when one of the cached Python versions is broken. For that, add a comma after '3.8' and add the version(s) in both conditions.

Please note: Installing Python versions this way is significantly slower than using cached versions, so use it only when necessary.


As you can see, this article got quite long even though it covers only the simplest setup. I cannot stress enough how valuable Travis CI’s contribution to open source was: it made CI accessible to everyone.

I still hope this article is helpful for getting started with a CI system that can be overwhelming. But on the other hand, you get to use the CI that Microsoft uses to build Visual Studio – for free.


Two people were invaluable for getting me to where I am: Anthony Shaw, who supported me in realtime, and Steve Dower, who helped me over the finish line with attrs at the PyCon US 2019 sprints.

The concept of using a different Python installation for Python-based tools like tox is shamelessly stolen from azure-pipelines-template that is also very much worth a look.

Anthony Sottile was a great help to make Python 3.8-dev work.

There has been a lot of useful proof-reading feedback from (alphabetically): Iacopo Spalletti, Marc Garcia, Ravin Kumar, and Russell Keith-Magee. All errors are still entirely mine.

  1. One of my projects went from almost 24 minutes on Travis CI to less than 3 minutes. And at this point they don’t even have any kind of caching – I can’t wait to see what that’ll do to the build times! ↩︎

  2. Please note that this token can leak despite being a “secret” variable. A malicious user can open a PR and send it anywhere they want. However, the token is worthless for anything except uploading coverage and it’s easy to see when someone does it. See also this article from Microsoft. ↩︎

  3. Please note that this token can leak despite being a “secret” variable. A malicious user can open a PR and send it anywhere they want. However, the token is worthless for anything except uploading coverage and it’s easy to see when someone does it. See also this article from Microsoft. ↩︎

  4. The coverage combine line is only necessary if you run coverage in --parallel mode. ↩︎