Today we look at a lightweight environment management tool: pip-tools.

GitHub - jazzband/pip-tools: A set of tools to keep your pinned Python dependencies fresh.

If you are mostly happy creating and managing virtual environments on your own but wish you had a few more features, then pip-tools is for you.

pip-tools consists of a few scripts that build on the existing functionality of venv and pip.

Installation

To use pip-tools, you first need to create a virtual environment with venv and then activate it.
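
For example, on Windows (matching the prompt style used elsewhere in this article) that might look like the following; the environment name .venv is just a common convention:

> python -m venv .venv
> .venv\Scripts\activate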

Then install pip-tools inside the virtualenv using pip

> python -m pip install pip-tools

pip-compile

pip-tools comes with two scripts. The first is pip-compile.

This script takes your input list of dependencies. These can live in a requirements.in file if you are using a traditional setup, or be defined in pyproject.toml if you follow the more modern standard.

It resolves the dependencies defined in the file, including all transitive dependencies, and generates a requirements.txt file as output. This output file contains the pinned versions of every dependency, so it is essentially a lock file.
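
If you use the traditional requirements.in workflow, the simplest invocation is the following; by default pip-compile writes the result to requirements.txt:

> pip-compile requirements.in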

Here is an example [project] section of a pyproject.toml file for a blog website built with pelican.

[project]
name = "pelican-blog"
version = "1.0"
authors = [{ name = "Author" }]
description = "Blog"
requires-python = ">=3.10"
dependencies = [
    "pelican ~= 4.8.0",
    "jsx-lexer ~= 2.0.0"
]

In the TOML file we define pelican ~= 4.8.0 as a dependency. The ~= operator is a "compatible release" specifier: it accepts any version >= 4.8.0 but below 4.9.0, i.e. the latest release in the 4.8.x series.

When we run pip-compile on this, we get an output requirements.txt like this

#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile --output-file=requirements.txt --resolver=backtracking '.\pyproject.toml'
#
blinker==1.5
commonmark==0.9.1
docutils==0.19
feedgenerator==2.0.0
jinja2==3.1.2
jsx-lexer==2.0.0
markupsafe==2.1.1
pelican==4.8.0
pygments==2.13.0
python-dateutil==2.8.2
pytz==2022.7
rich==12.6.0
six==1.16.0
unidecode==1.3.6

Once in a while we will want to update to newer versions of the packages. Running pip-compile --upgrade will recalculate the latest dependencies that match the constraints in pyproject.toml and generate a new requirements.txt lock file.
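
For example (pelican here is simply the package from the sample project above):

> pip-compile --upgrade
> pip-compile --upgrade-package pelican

The first command recalculates every pin; the second, using the --upgrade-package option, bumps only the named package while keeping the rest of the lock file unchanged.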

pip-compile also has a --generate-hashes parameter that will generate hashes and put them in requirements.txt. Here is part of the file generated from the same pyproject.toml shown above

#
# This file is autogenerated by pip-compile with Python 3.11
# by the following command:
#
#    pip-compile --generate-hashes --output-file=requirements.txt --resolver=backtracking '.\pyproject.toml'
#
blinker==1.5 \
    --hash=sha256:1eb563df6fdbc39eeddc177d953203f99f097e9bf0e2b8f9f3cf18b6ca425e36 \
    --hash=sha256:923e5e2f69c155f2cc42dafbbd70e16e3fde24d2d4aa2ab72fbe386238892462
    # via pelican
commonmark==0.9.1 \
    --hash=sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60 \
    --hash=sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9
    # via rich
docutils==0.19 \
    --hash=sha256:33995a6753c30b7f577febfc2c50411fec6aac7f7ffeb7c4cfe5991072dcf9e6 \
    --hash=sha256:5e1de4d849fee02c63b040a4a3fd567f4ab104defd8a5511fbbc24a8a017efbc
    # via pelican

When hashes are present, pip will verify each downloaded package against its hash to ensure that a corrupted or malicious package is not installed. Note that a package might have different hashes for different distributions (source distribution, wheel, etc.), which is why several hashes can appear under a single pinned version.
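
If you install with plain pip, hash-checking mode is enabled automatically as soon as the requirements file contains hashes; the --require-hashes flag simply makes that requirement explicit:

> python -m pip install --require-hashes -r requirements.txt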

pip-sync

The other script provided by pip-tools is pip-sync. This tool simply takes your requirements.txt file and makes sure that the installed packages match it exactly. This could involve installing new packages, upgrading (or downgrading) existing packages, or even removing packages that are no longer required. Think of it as a supercharged pip.
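
Running it is a single command; with no arguments, pip-sync looks for requirements.txt in the current directory:

> pip-sync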

You might ask: why not just use pip for this? After all, pip can read a list of packages from requirements.txt and install them.

The issue with pip comes when transitive dependencies are involved. Suppose we no longer require jsx-lexer in our application. We could run pip uninstall jsx-lexer and that would remove the package, but what about its transitive dependencies? They will still be around.

The same workflow with pip-tools would be as follows:

  • First, remove the dependency from pyproject.toml
  • Then run pip-compile to determine the new set of dependencies. This creates a new requirements.txt file without the removed package or its transitive dependencies. Any transitive dependency that is still required by another package will be retained
  • Finally, run pip-sync to sync the packages in the virtual environment with what is listed in requirements.txt; any extra packages in the environment will be removed (see the commands sketched below)
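
Sticking with the pelican example above, the last two steps boil down to these commands (the output file name matches the one generated earlier):

> pip-compile --output-file=requirements.txt .\pyproject.toml
> pip-sync requirements.txt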

Multiple Environment Support

Most projects have multiple environments. You may want to install only production dependencies if you are only going to run the application, whereas if you will be developing it you might want additional testing or documentation tools to be installed.

Typically such dependencies are configured using the [project.optional-dependencies] section of pyproject.toml like this

[project.optional-dependencies]
dev = ["pytest"]

You can then pass the --extra dev option to pip-compile to include the additional dependencies defined for the dev extra in pyproject.toml. Make sure you write the output to a different file, such as dev-requirements.txt, and check both files into source control.
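
For example, the dev lock file could be generated like this; dev-requirements.txt is just a naming convention, not something pip-tools mandates:

> pip-compile --extra dev --output-file=dev-requirements.txt .\pyproject.toml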

When replicating the environment later on, you can run plain pip-sync to install only the packages in requirements.txt, or specify the dev file with pip-sync dev-requirements.txt to get the extra dev packages as well.

Summary

If you have been using pip and venv for a long while and like how they work, but wish for a few more features like pinned packages, reproducible builds and hash-verified installs, then pip-tools is the ideal solution for you.

pip-tools doesn't do anything complex. It just adds two scripts that work in conjunction with your existing setup. There is no mental overhead of learning a whole new environment management tool, or re-configuring your project. Just install pip-tools in your virtual environment and you are ready to go.

Did you like this article?

If you liked this article, consider subscribing to this site. Subscribing is free.

Why subscribe? Here are three reasons:

  1. You will get every new article as an email in your inbox, so you never miss an article
  2. You will be able to comment on all the posts, ask questions, etc
  3. Once in a while, I will be posting conference talk slides, longer form articles (such as this one), and other content as subscriber-only