One of my first TIL entries was about how you can imitate Node’s node_modules semantics in Python on UNIX-like operating systems. A lot has happened since then (for the better!) and it’s time for an update. direnv still rocks, though.
One major thing that happened to Python is uv (have you seen my videos about it? Is uv the Future of Python Packaging? and uv is the Future of Python Packaging! offer a gripping story arc over six months!), and another thing that “happened” to me is an ARM-based computer and my need to run Python in both Intel and ARM mode, which somewhat complicated my life (see also How to Automatically Switch to Rosetta With Fish and Direnv).
I have also embraced the emerging standard of putting a project’s virtual environment into an in-project directory called .venv. I liked having a centralized place with virtualfish better, but that’s not a hill I’m willing to die on. The tools overall work better this way (for example, VS Code doesn’t need any help finding it), and the JavaScript community has somehow survived with their in-project node_modules, too.
direnv
What hasn’t changed is my enthusiasm for direnv. Broadly speaking, direnv will execute a file called .envrc when you enter a directory. But it’s smart enough to only execute it when necessary (entering the directory, changes to .envrc) and to undo the changes once you leave the directory.
It’s a much more powerful version of the venerable .env files (which it can load, too), and I really like it as glue between non-overbearing tools. Even if you choose not to use the same tools as I do, chances are direnv can improve your life.
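To make that concrete, here’s a minimal, hypothetical .envrc. The exported variable is made up for illustration; dotenv is direnv’s stdlib helper for loading a classic .env file:

# .envrc -- direnv runs this on entry and undoes it on exit
export DATABASE_URL=postgresql://localhost/dev  # hypothetical example variable
dotenv  # optionally also load a plain .env file, if you have one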
With Astral releasing uv and taking over Rye, I think it’s fair to say that Python packaging and project workflow tools have entered an exciting, transitional period. So, it’s unclear how well this post will age. But in the worst case, it will be a historical record once we’re all riding unicorns through the sky.
Python installations
Nowadays, I get as many binary Python installers directly from https://www.python.org/downloads/ as I can, because they’re universal2 builds with a special command that always runs in Intel mode (for example, python3.12-intel64), which makes creating Intel-based virtual environments less error-prone. As Glyph said: Get Your Mac Python From Python.org.
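As a sketch of why that matters: the -intel64 command lets you create an x86_64 virtual environment on an Apple Silicon Mac without any arch gymnastics (the version number and path here are just examples):

python3.12-intel64 -m venv .venv-intel64   # Intel (x86_64) environment
.venv-intel64/bin/python -c 'import platform; print(platform.machine())'  # should print x86_64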
I fill in the missing ones – that I need for my open-source projects – using one of the installers for Gregory Szorc’s python-build-standalone. They’re a tad too quirky for production use, but they’re wonderful for driving tox and Nox.
On Linux, we use deadsnakes for everything.
I use different mechanisms to manage the virtual environments depending on whether I’m developing a package1 without strict pinning or an application with strict pinning.
Unpinned package
I’m aware that it makes sense to pin your development and test dependencies for a stable CI, too. However, none of my projects has enough churn and dependency-caused breakage to justify the sad look of a commit history consisting of 99% Dependabot updates. To me, the pragmatic trade-off is to pin or fix only when I run into problems.
While direnv has built-in support for the standard library’s venv module, it’s glacial. virtualenv is much faster, but uv is faster still.
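For reference, that built-in support is direnv’s layout stdlib function, a one-liner in .envrc, though direnv then decides where the environment lives (typically under .direnv/) rather than using .venv:

layout python python3.12   # direnv stdlib; uses the standard library's venv under the hood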
Unlike applications, my packages usually support more than one Python version. To have one canonical place to store the current default development version of a package, I’ve started adding a .python-version-default file2 to my projects that contains the version I want to use.
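The file contains nothing but a version string (the same format as pyenv’s .python-version), for example:

3.12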
Then, I add the following to the project’s .envrc:
test -d .venv || uv venv --python $(cat .python-version-default)
source .venv/bin/activate
Now direnv ensures the virtual environment exists and is active whenever I enter the directory: test -d .venv checks whether .venv is an existing directory and, if it isn’t, uv venv creates it for whatever version is in .python-version-default.
The nice thing is that I can use the same file in GitHub Actions as an input to setup-python:
- uses: actions/setup-python@v5
  with:
    python-version-file: .python-version-default
Bonus tip
Here’s a Fish shell function to recreate the virtual environment if needed:
function ,repypkg
    rm -rf .tox .nox .venv
    # Ensure .venv exists -- direnv exec re-runs .envrc, which recreates it.
    direnv exec . python -Im site
    uv pip install --editable .[dev]
end
Given how fast uv is, this is also the best way to update all dependencies. For structlog, it takes about 600ms with a hot cache.
Sidenote: I use direnv exec instead of direnv reload because the latter seems to run asynchronously and uv pip install fails with No such file or directory (os error 2).
Pinned application
Because I develop on an ARM Mac for Intel Linux, I need cross-platform lock files, which rules out the otherwise excellent pip-tools, as well as pip-tools-adjacent tools like Rye and older versions of uv3.
Of all the alternatives, I’ve found uv in project mode and PDM to work well.
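For illustration, with uv’s project mode the lock file is cross-platform by default; roughly:

uv lock           # resolves for all platforms and writes uv.lock
uv sync --frozen  # installs exactly what's locked, without re-resolving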
uv’s high-level features have virtualenv support baked in and look for the correct Python version in pyproject.toml’s project.requires-python. But Astral doesn’t believe in activating them. Therefore, my .envrc looks like this:
uv sync
source .venv/bin/activate
uv sync makes sure .venv exists with the correct Python version – source makes sure I can run my CLI scripts from the project without a uv run prefix.
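Concretely, assuming a hypothetical console script called some-app defined under [project.scripts], the activation means this works directly:

some-app --help          # works because .venv/bin is on PATH
uv run some-app --help   # what you'd have to type without activation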
If you’re interested in building Docker containers for uv applications, check out Production-ready Python Docker Containers with uv.
Bonus tip: DRY Python version detection
With PDM, we have to detect the version ourselves.
We use GitLab CI to build our Python Docker containers, and I get the correct Python version from requires-python in the project’s pyproject.toml:
[project]
name = "some-app"
requires-python = "~=3.12.0"
This way, I can extract that string in the CI configuration in .gitlab-ci.yml and pass it as a build argument to docker build:
# ...
build:
  stage: build
  only: [main]
  script:
    - export PY=$(sed -nE 's/^requires-python = "~=(3\.[0-9]+)\.0"$/python\1/p' pyproject.toml)
    # PY is something like `python3.12` now
    - >
      docker build
      --build-arg PY=$PY
      # ...
Then, in the Dockerfile
, use the extracted version to create a virtual environment in the build stage and to install the Python version in the application stage:
FROM your-docker/build-image as build
ARG PY
RUN --mount=type=cache,target=/root/.cache \
    set -ex \
    && virtualenv --python $PY /app
# ...

FROM your-docker/app-image
# ARG is per-stage in Dockerfiles, so it has to be re-declared here.
ARG PY
# ...
RUN set -ex \
    && apt-get update -qy \
    && apt-get install -qyy \
       -o APT::Install-Recommends=false \
       -o APT::Install-Suggests=false \
       $PY
# ...
COPY --from=build --chown=app /app /app
# ...
I could do the same operation in my .envrc, but why not extract the command line from .gitlab-ci.yml and eval it instead?
eval "$(sed -nE 's/^.*- (export PY=.*)/\1/p' .gitlab-ci.yml)"
Now I have a shell variable PY containing something like python3.12, based on metadata from pyproject.toml, with no duplication. As a bonus, it also verifies that the version extraction in CI actually works.
Locally, I use that variable when I want to recreate the project’s virtual environment:
rm -rf .venv
# Create the venv with the Python version extracted from .gitlab-ci.yml.
pdm venv create $(command -v $PY)
# Point PDM at it, ignoring any previously remembered interpreter.
pdm use --ignore-remembered .venv/bin/python
pdm sync --dev
Check out TIL: which is not POSIX if you’re irritated by the command -v.
Look ma, no duplication!
A package as in: A code library you install from PyPI. All my applications are technically packages, too, but that’s not what I mean here. ↩︎
It used to be .python-version for seamless pyenv integration, but some pyenv users complained that they wanted to keep control. You can still achieve the same using ln -s .python-version-default .python-version since it’s the same format. ↩︎
No, I don’t want to run pip-compile in a Docker container. ↩︎