Sharing Your Labor of Love: PyPI Quick and Dirty

A completely incomplete guide to packaging a Python module and sharing it with the world on PyPI.


Few things give me caremads like Python modules I want to use that aren’t on PyPI¹. On the other hand – as pydanny points out – packaging is rather confusing.

Therefore I want to help everyone who has some great code but feels lost with making it available to the broad public. I will be using my own project attrs as a realistic yet simple example of how to get a pure-Python 2 and 3 module packaged up, tested, and uploaded to PyPI. Including the binary wheel format that’s faster and allows for binary extensions!

I’ll keep it simple to get everyone started. At the end, I’ll link more complete documentation to show you the way ahead.

Tools Used

This is not a history lesson, therefore we will use:

$ pip install -U "pip>=1.4" "setuptools>=0.9" "wheel>=0.21" twine

Please make sure all installs succeed since ancient installations may need some extra manual labor.

A Minimal Glimpse Into The Past

Forget that there ever was distribute (cordially merged into setuptools), easy_install (part of setuptools, supplanted by pip), or distutils2 aka packaging (was supposed to be the official thing from Python 3.3 on, didn’t get done in time due to lack of helping hands, got ripped out by a heart-broken Éric and abandoned now).

Be just vaguely aware that there are distutils and distlib somewhere underneath but ideally it shouldn’t matter to you at all for now.

setup.py

Nowadays, every project that you want to package needs a setup.py file that is executed whenever you build a distribution and – unless you install a wheel – on each installation.

For better or for worse, the Python community largely embraced cargo culting here and everyone usually just copies setup.py from one project to another and adapts some bits. Since the average setup.py consists only of metadata with some boilerplate code, it even makes sense².

Let’s have a look at what attrs’s setup.py looks like:

import codecs
import os
import re

from setuptools import setup, find_packages


###############################################################################

NAME = "attrs"
PACKAGES = find_packages(where="src")
META_PATH = os.path.join("src", "attr", "__init__.py")
KEYWORDS = ["class", "attribute", "boilerplate"]
CLASSIFIERS = [
    "Development Status :: 5 - Production/Stable",
    "Intended Audience :: Developers",
    "Natural Language :: English",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Python",
    "Programming Language :: Python :: 2",
    "Programming Language :: Python :: 2.7",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.3",
    "Programming Language :: Python :: 3.4",
    "Programming Language :: Python :: 3.5",
    "Programming Language :: Python :: Implementation :: CPython",
    "Programming Language :: Python :: Implementation :: PyPy",
    "Topic :: Software Development :: Libraries :: Python Modules",
]
INSTALL_REQUIRES = []

###############################################################################

HERE = os.path.abspath(os.path.dirname(__file__))


def read(*parts):
    """
    Build an absolute path from *parts* and return the contents of the
    resulting file.  Assume UTF-8 encoding.
    """
    with codecs.open(os.path.join(HERE, *parts), "rb", "utf-8") as f:
        return f.read()


META_FILE = read(META_PATH)


def find_meta(meta):
    """
    Extract __*meta*__ from META_FILE.
    """
    meta_match = re.search(
        r"^__{meta}__ = ['\"]([^'\"]*)['\"]".format(meta=meta),
        META_FILE, re.M
    )
    if meta_match:
        return meta_match.group(1)
    raise RuntimeError("Unable to find __{meta}__ string.".format(meta=meta))


if __name__ == "__main__":
    setup(
        name=NAME,
        description=find_meta("description"),
        license=find_meta("license"),
        url=find_meta("uri"),
        version=find_meta("version"),
        author=find_meta("author"),
        author_email=find_meta("email"),
        maintainer=find_meta("author"),
        maintainer_email=find_meta("email"),
        keywords=KEYWORDS,
        long_description=read("README.rst"),
        packages=PACKAGES,
        package_dir={"": "src"},
        zip_safe=False,
        classifiers=CLASSIFIERS,
        install_requires=INSTALL_REQUIRES,
    )

As you can see, I’ve accepted that most of setup.py is boilerplate and put the metadata into a separate block at the top (enclosed by #s).

I’ve also accepted that I’m but a fallible human and therefore all my metadata is saved in my __init__.py files and extracted using regular expressions. Another approach is to put this data into a special module and parse that file using Python like PyCA’s cryptography does. Which approach you take is a matter of personal preference. Importing your actual package is a bad idea though because you will run into dependency problems quickly.
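To see the regex approach in action outside of a real setup.py, here is a minimal, self-contained sketch; the file contents and the dunder names are made up for illustration:

```python
import re

# Hypothetical contents of a package's __init__.py; a real setup.py would
# read this from disk with a helper like read() above.
META_FILE = '__version__ = "15.1.0"\n__license__ = "MIT"\n'


def find_meta(meta):
    # Match lines like: __version__ = "15.1.0"
    match = re.search(
        r"^__{meta}__ = ['\"]([^'\"]*)['\"]".format(meta=meta),
        META_FILE, re.M
    )
    if match:
        return match.group(1)
    raise RuntimeError("Unable to find __{meta}__ string.".format(meta=meta))


print(find_meta("version"))  # 15.1.0
print(find_meta("license"))  # MIT
```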

As you can see, I’m putting my packages into an un-importable src directory. I wrote on the whys elsewhere. I strongly encourage you to follow suit.

There’s a lot of culture to cargo out there, but this is mine.

The metadata is mostly self-explaining but I’d like to stress one field: license. Always set a license! Otherwise, nobody can legally use your module. Which would be a pity, right?

The packages field uses a setuptools function to detect packages underneath src and the package_dir field explains where the root package is found.

You can also specify them by hand like I used to do for doc2dash’s setup.py. However – unless you have good reasons – just use find_packages() and be done.
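If you want to convince yourself of what find_packages(where="src") actually discovers, a throwaway sketch against a temporary directory demonstrates it (the package name attr mirrors the example above):

```python
import os
import tempfile

from setuptools import find_packages

with tempfile.TemporaryDirectory() as tmp:
    # Simulate a src layout: src/attr/__init__.py
    pkg = os.path.join(tmp, "src", "attr")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()

    # find_packages() looks for directories containing an __init__.py.
    found = find_packages(where=os.path.join(tmp, "src"))

print(found)  # ['attr']
```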

How to set and keep a project’s version is a matter of taste³ and there are different solutions. After using a few tools in the past, I’ve resorted to keeping it post-fixed with .dev0 while in development (e.g. "15.2.0.dev0"). As part of a release I strip the suffix, push the package to PyPI, then increment the version number and add the suffix again as part of the housekeeping I do whenever a new version cycle starts. That way, in-development versions are easily discernible.
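The release-time suffix stripping can be as simple as a regular expression; this is merely an illustration of the idea, not a tool I actually use:

```python
import re


def strip_dev_suffix(version):
    # "15.2.0.dev0" -> "15.2.0"; assumes a PEP 440-style ".devN" suffix.
    return re.sub(r"\.dev\d+$", "", version)


print(strip_dev_suffix("15.2.0.dev0"))  # 15.2.0
print(strip_dev_suffix("15.2.0"))       # unchanged: 15.2.0
```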

The classifiers field’s usefulness is openly disputed. Nevertheless pick them from here. PyPI will refuse to accept packages with unknown classifiers. Therefore I like to use "Private :: Do Not Upload" for private packages to protect myself from my own stupidity.

One icky thing is dependencies. Unless you really know what you’re doing, don’t pin them (specifying the minimal version your package requires to work is fine, of course) or your users won’t be able to install security updates of your dependencies.

Rule of thumb: requirements.txt should contain only ==, setup.py the rest (>=, !=, <=, …).
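In other words: the abstract dependencies in setup.py stay loose, while the requirements.txt of a deployed application pins exact, tested versions. A sketch of the rule (the package names and versions are made up):

```python
# setup.py of a *library*: loose lower bounds so users can pick up
# security updates of your dependencies.
install_requires = ["six>=1.4.0"]

# requirements.txt of an *application*: exact pins for reproducibility.
requirements_txt = ["six==1.9.0"]

# The rule of thumb, expressed as a check:
assert all("==" not in spec for spec in install_requires)
assert all("==" in spec for spec in requirements_txt)
```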

Non-Code Files

Every Python project has an

“add forgotten files to MANIFEST.in”

commit. Look it up, it’s true.

You have to add to MANIFEST.in all files and directories that are not already packaged due to the packages keyword (or py_modules, if your project is not a package) of your setup() call.

For attrs, the MANIFEST.in looks like this:

include *.rst *.txt LICENSE tox.ini .travis.yml docs/Makefile .coveragerc
recursive-include tests *.py
recursive-include docs *.rst
recursive-include docs *.py
prune docs/_build

For more commands, have a look at the docs. If you would like to avoid the aforementioned commit, run check-manifest in your CI from the start; it will also give you helpful hints on how to fix the errors it reports.

Important: If you want the files and directories from MANIFEST.in to also be installed (e.g. if it’s runtime-relevant data), you will have to set include_package_data=True in your setup() call.


setup.cfg

For our minimal Python-only project, we’ll only need four lines in setup.cfg:

[bdist_wheel]
universal = 1

[metadata]
license_file = LICENSE

The first part will make wheel build a universal wheel file (e.g. attrs-15.1.0-py2.py3-none-any.whl) and you won’t have to circle through virtual environments of all supported Python versions to build them separately. The second part ensures that your LICENSE file is part of the wheel files which is a common license requirement.
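The file name of a wheel encodes its compatibility: per PEP 427 it is {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl. Taking the universal wheel from above apart shows why it installs everywhere (the naive split works here because neither name nor version contains a dash):

```python
# PEP 427 wheel file name: {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
wheel = "attrs-15.1.0-py2.py3-none-any.whl"

name, version, python_tag, abi_tag, platform_tag = wheel[:-len(".whl")].split("-")

print(python_tag)    # py2.py3 -> runs on Python 2 and 3
print(abi_tag)       # none    -> no compiled-extension ABI
print(platform_tag)  # any     -> not platform-specific
```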


Documentation

As I’ve hopefully established, every open source project needs a license. No excuses.

Additionally, even the simplest package needs a README that tells potential users what they’re looking at. Make it reStructuredText (reST) so PyPI can properly render it on your project page. And as a courtesy to your users, also keep an easily discoverable changelog so they know what to expect from your releases. I like to put them into the project root directory and call them README.rst and CHANGELOG.rst respectively. The changelog is also included as part of my Sphinx documentation in docs/changelog.rst. But there’s no hard rule.

My long project description and thus PyPI text is also the README.rst which appears to be common nowadays.

If you host your project on GitHub, you may want to add a CONTRIBUTING.rst that gets displayed when someone wants to open a pull request. Have a look at attrs’s CONTRIBUTING.rst if you need inspiration.

Let’s Build Already!

Building a source distribution and a wheel of your project is just a matter of

$ rm -rf build
$ python setup.py sdist bdist_wheel

The first line accounts for a bug in wheel (pointed out to me by Michael Merickel) that won’t clean up the build directory between builds, which puts you at risk of shipping stale build files.

Uploading wheel versions of your package is optional and as of version 7.0.0, pip will wheel and cache your sdists automatically before installing them. That makes future installs much faster but it’s nicer to make the first install fast too by uploading wheels yourself.

Now you should have a new directory called dist containing a source distribution file and a wheel file. For attrs, it could look like this:

dist
├── attrs-15.1.0-py2.py3-none-any.whl
└── attrs-15.1.0.tar.gz

You can test whether both install properly before we move on to uploading (the examples use UNIX commands; Windows would be similar):

$ rm -rf 27-sdist  # ensure clean state if run repeatedly
$ virtualenv 27-sdist
$ 27-sdist/bin/pip install dist/attrs-15.1.0.tar.gz
Processing ./dist/attrs-15.1.0.tar.gz
Building wheels for collected packages: attrs
  Running bdist_wheel for attrs
  Stored in directory: /Users/hynek/Library/Caches/pip/wheels/67/a6/e3/...
Successfully built attrs
Installing collected packages: attrs
Successfully installed attrs-15.1.0
$ cd 27-sdist/
$ bin/python
Python 2.7.10 (default, Jun  5 2015, 10:57:55)
[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import attr
>>> attr.__version__
'15.1.0'
$ cd ..


$ rm -rf 27-wheel  # ensure clean state if run repeatedly
$ virtualenv 27-wheel
$ 27-wheel/bin/pip install dist/attrs-15.1.0-py2.py3-none-any.whl
Processing ./dist/attrs-15.1.0-py2.py3-none-any.whl
Installing collected packages: attrs
Successfully installed attrs-15.1.0
$ cd 27-wheel/
$ bin/python
Python 2.7.10 (default, Jun  5 2015, 10:57:55)
[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import attr
>>> attr.__version__
'15.1.0'

Please note the lack of “Running bdist_wheel for attrs” in the second test. Yes, you can finally install packages without executing arbitrary code!

So you’re confident that your package is perfect? Let’s use the Test PyPI server to find out!

The PyPI Staging Server

Again, I’ll be using attrs as the example project name to avoid <your project name> everywhere.

First, sign up on the test server, you will receive a user name and a password. Please note that this is independent from the live servers. Thus you’ll have to re-register both yourself and your packages. It also gets cleaned from time to time so don’t be surprised if it suddenly doesn’t know about you or your projects anymore. Just re-register.

Next, create a ~/.pypirc consisting of:


[distutils]
index-servers =
    test

[test]
repository = https://testpypi.python.org/pypi
username = <your user name goes here>
password = <your password goes here>

Then use

$ python setup.py register -r test

to register your project with the PyPI test server.

Finally, let’s use twine⁴ to safely upload our previously built distributions:

$ twine upload -r test -s dist/attrs-15.1.0*

The -s option tells twine to sign the package with your GnuPG key. Omit it if you do not want to do that.

Now test your packages again:

$ pip install -i https://testpypi.python.org/pypi attrs

Everything dandy? Then let’s tackle the last step: putting it on the real PyPI!

The Final Step

First, register at PyPI, then complete your ~/.pypirc:


[distutils]
index-servers =
    pypi
    test

[test]
repository = https://testpypi.python.org/pypi
username = <your test user name goes here>
password = <your test password goes here>

[pypi]
repository = https://pypi.python.org/pypi
username = <your production user name goes here>
password = <your production password goes here>

One last deep breath and let’s rock:

$ python setup.py register
$ twine upload -s dist/attrs-15.1.0*

And thus, your package is only a pip install away for everyone! Congratulations, do more of that!

Bonus tip: You can delete releases from PyPI but you cannot re-upload them under the same version number! So be careful before uploading and deleting: you can’t just replace a release with a different file.

Next Steps

The information herein will probably get you pretty far, but if you get stuck, the current canonical truth for Python packaging is the Python Packaging User Guide.


This article has been kindly proof-read by Lynn Root, Donald Stufft, Alex Gaynor, Thomas Heinrichsdobler, and Jannis Leidel. All mistakes are still mine though.


  1. Pronounced “pie pee eye”, or “cheese shop”, not “pie pie”! ↩︎
  2. To end this, there are PEPs underway. Most notably PEP 0517 and PEP 0518. ↩︎
  3. As of PEP 0440, PyPI and the packaging ecosystem have opinions on the structure of the version string though. ↩︎
  4. There are two main reasons I prefer twine over the old-school setup.py (sdist|bdist_wheel) upload: 1. it uses TLS on all Python versions while properly verifying certificates and 2. I can pre-build distributions, test them, and then upload them all at once. ↩︎