A set of command line tools to help you keep your pip-based packages fresh,
even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)
Similar to pip, pip-tools must be installed in each of your project's
virtual environments:
$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools
Note: all of the remaining example commands assume you've activated your project's virtual environment.
The pip-compile command lets you compile a requirements.txt file from
your dependencies, specified in either pyproject.toml, setup.cfg,
setup.py, or requirements.in.
Run it with pip-compile or python -m piptools compile (or
pipx run --spec pip-tools pip-compile if pipx was installed with the
appropriate Python version). If you use multiple Python versions, you can also
run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile
on other systems.
pip-compile should be run from the same virtual environment as your
project so conditional dependencies that require a specific Python version,
or other environment markers, resolve relative to your project's
environment.
Note: If pip-compile finds an existing requirements.txt file that
fulfils the dependencies then no changes will be made, even if updates are
available. To compile from scratch, first delete the existing
requirements.txt file, or see
Updating requirements
for alternative approaches.
The pyproject.toml file is the
latest standard for configuring
packages and applications, and is recommended for new projects. pip-compile
supports both installing your project.dependencies as well as your
project.optional-dependencies. Thanks to the fact that this is an
official standard, you can use pip-compile to pin the dependencies
in projects that use modern standards-adhering packaging tools like
Setuptools, Hatch
or flit.
Suppose you have a 'foobar' Python application that is packaged using Setuptools,
and you want to pin it for production. You can declare the project metadata as:
[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"
[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]
[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }
Now suppose you have a Django application that is packaged using Hatch, and you
want to pin it for production, while also pinning your development tools in a
separate pin file. You declare django as a dependency and create an
optional dependency dev that includes pytest:
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]
[project.optional-dependencies]
dev = ["pytest"]You can produce your pin files as easily as:
$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
# via django
django==4.1.7
# via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
# via django
$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
# via django
attrs==22.2.0
# via pytest
django==4.1.7
# via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
# via pytest
iniconfig==2.0.0
# via pytest
packaging==23.0
# via pytest
pluggy==1.0.0
# via pytest
pytest==7.2.2
# via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
# via django
tomli==2.0.1
# via pytest
This is great both for pinning your applications and for keeping the CI of your open-source Python package stable.
pip-compile also has full support for setup.py- and
setup.cfg-based projects that use setuptools.
Just define your dependencies and extras as usual and run
pip-compile as above.
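For example, a minimal setup.cfg for the 'foobar' application above might look like this (a sketch; the metadata and requirements are illustrative):
# setup.cfg (illustrative sketch)
[metadata]
name = foobar

[options]
install_requires =
    django

[options.extras_require]
dev =
    pytest
Running pip-compile setup.cfg (optionally with --extra dev) should then produce pinned output just as in the pyproject.toml examples.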
You can also use plain text files for your requirements (e.g. if you don't
want your application to be a package). To use a requirements.in file to
declare the Django dependency:
# requirements.in
django
Now, run pip-compile requirements.in:
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile requirements.in
#
asgiref==3.6.0
# via django
django==4.1.7
# via -r requirements.in
sqlparse==0.4.3
# via django
And it will produce your requirements.txt, with all the Django dependencies
(and all underlying dependencies) pinned.
pip-compile generates a requirements.txt file using the latest versions
that fulfil the dependencies you specify in the supported files.
If pip-compile finds an existing requirements.txt file that fulfils the
dependencies then no changes will be made, even if updates are available.
To force pip-compile to update all packages in an existing
requirements.txt, run pip-compile --upgrade.
To update a specific package to the latest or a specific version use the
--upgrade-package or -P flag:
# only update the django package
$ pip-compile --upgrade-package django
# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests
# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0
You can combine --upgrade and --upgrade-package in one command, to
provide constraints on the allowed upgrades. For example to upgrade all
packages whilst constraining requests to the latest version less than 3.0:
$ pip-compile --upgrade --upgrade-package 'requests<3.0'
If you would like to use the Hash-Checking Mode available in pip since
version 8.0, pip-compile offers the --generate-hashes flag:
$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
--hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
--hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
# via django
django==4.1.7 \
--hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
--hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
# via -r requirements.in
sqlparse==0.4.3 \
--hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
--hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
# via django
To output the pinned requirements in a filename other than
requirements.txt, use --output-file. This might be useful for compiling
multiple files, for example with different constraints on django to test a
library with both versions using tox:
$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt
Or to output to standard output, use --output-file=-:
$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt
Any valid pip flags or arguments may be passed on with pip-compile's
--pip-args option, e.g.
$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"
You can define project-level defaults for pip-compile and pip-sync by
writing them to a configuration file in the same directory as your requirements
input files (or the current working directory if piping input from stdin).
By default, both pip-compile and pip-sync will look first
for a .pip-tools.toml file and then in your pyproject.toml. You can
also specify an alternate TOML configuration file with the --config option.
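For example, assuming your defaults live in a file named pip-tools.custom.toml (an illustrative name):
$ pip-compile --config pip-tools.custom.toml requirements.in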
It is possible to specify configuration values both globally and per command.
For example, to generate pip hashes in the resulting requirements file output
by default, you can specify in a configuration file:
[tool.pip-tools]
generate-hashes = true
Options to pip-compile and pip-sync that may be used more than once
must be defined as lists in a configuration file, even if they only have one
value.
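For example, the repeatable --extra option of pip-compile could be given defaults this way (the extra names are illustrative):
[tool.pip-tools]
extra = ["dev", "test"]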
pip-tools supports default values for all valid command-line flags
of its subcommands. Configuration keys may contain underscores instead of dashes,
so the above could also be specified in this format:
[tool.pip-tools]
generate_hashes = true
Configuration defaults specific to pip-compile and pip-sync can be put beneath
separate sections. For example, to make pip-compile perform a dry-run by default:
[tool.pip-tools.compile] # "sync" for pip-sync
dry-run = true
This does not affect the pip-sync command, which also has a --dry-run option.
Note that command-specific settings take precedence over global settings of the
same name whenever both are declared; thus the following would also make
pip-compile generate hashes, but discard the global dry-run setting:
[tool.pip-tools]
generate-hashes = true
dry-run = true
[tool.pip-tools.compile]
dry-run = false
You might be wrapping the pip-compile command in another script. To avoid
confusing consumers of your custom script you can override the update command
generated at the top of requirements files by setting the
CUSTOM_COMPILE_COMMAND environment variable.
$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# ./pipcompilewrapper
#
asgiref==3.6.0
# via django
django==4.1.7
# via -r requirements.in
sqlparse==0.4.3
# via django
If you have different environments for which you need to install different but compatible packages, then you can create layered requirements files and use one layer to constrain the other.
For example, if you have a Django project where you want the newest 2.1
release in production and when developing you want to use the Django debug
toolbar, then you can create two *.in files, one for each layer:
# requirements.in
django<2.2
At the top of the development requirements file dev-requirements.in, use
-c requirements.txt to constrain the dev requirements to packages already
selected for production in requirements.txt.
# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2
First, compile requirements.txt as usual:
$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile
#
django==2.1.15
# via -r requirements.in
pytz==2023.3
# via django
Now compile the dev requirements and the requirements.txt file is used as
a constraint:
$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile dev-requirements.in
#
django==2.1.15
# via
# -c requirements.txt
# django-debug-toolbar
django-debug-toolbar==2.1
# via -r dev-requirements.in
pytz==2023.3
# via
# -c requirements.txt
# django
sqlparse==0.4.3
# via django-debug-toolbar
As you can see above, even though a 2.2 release of Django is available, the
dev requirements only include a 2.1 version of Django because they were
constrained. Now both compiled requirements files can be installed safely in
the dev environment.
To install requirements in production, use:
$ pip-sync
You can install requirements in development with:
$ pip-sync requirements.txt dev-requirements.txt
You might use pip-compile as a pre-commit hook.
See pre-commit docs for instructions.
Sample .pre-commit-config.yaml:
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
You might want to customize pip-compile args by configuring args and/or files, for example:
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]
If you have multiple requirement files, make sure you create a hook for each file.
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$
Now that you have a requirements.txt, you can use pip-sync to update
your virtual environment to reflect exactly what's in there. This will
install/upgrade/uninstall everything necessary to match the
requirements.txt contents.
Run it with pip-sync or python -m piptools sync. If you use multiple
Python versions, you can also run py -X.Y -m piptools sync on Windows and
pythonX.Y -m piptools sync on other systems.
pip-sync must be installed into and run from the same virtual
environment as your project to identify which packages to install
or upgrade.
Be careful: pip-sync is meant to be used only with a
requirements.txt generated by pip-compile.
$ pip-sync
Uninstalling flake8-2.4.1:
Successfully uninstalled flake8-2.4.1
Collecting click==4.1
Downloading click-4.1-py2.py3-none-any.whl (62kB)
100% |................................| 65kB 1.8MB/s
Found existing installation: click 4.0
Uninstalling click-4.0:
Successfully uninstalled click-4.0
Successfully installed click-4.1
To sync multiple *.txt dependency lists, just pass them in via command
line arguments, e.g.
$ pip-sync dev-requirements.txt requirements.txt
If no files are passed in, pip-sync defaults to requirements.txt.
Any valid pip install flags or arguments may be passed with pip-sync's
--pip-args option, e.g.
$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"
Note: pip-sync will not upgrade or uninstall packaging tools like
setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade
to upgrade those packages.
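For example, inside the active virtual environment:
$ python -m pip install --upgrade pip setuptools pip-tools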
Generally, yes. If you want a reproducible environment installation available from your source control,
then yes, you should commit both requirements.in and requirements.txt to source control.
Note that if you are deploying on multiple Python environments (read the section below),
then you must commit a separate output file for each Python environment.
We suggest using the {env}-requirements.txt format
(ex: win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).
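For example, on each target environment you might run (the output name is illustrative, following the format above):
$ pip-compile --output-file=win32-py3.7-requirements.txt requirements.in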
The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.
As the resulting requirements.txt can differ for each environment, users must
execute pip-compile on each Python environment separately to generate a
requirements.txt valid for each said environment. The same requirements.in can
be used as the source file for all environments, using
PEP 508 environment markers as
needed, the same way it would be done for regular pip cross-environment usage.
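For example, a single requirements.in could gate a dependency on one platform with a marker (colorama is just an illustration here):
# requirements.in
django
colorama ; sys_platform == "win32"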
If the generated requirements.txt remains exactly the same for all Python
environments, then it can be used across Python environments safely. But users
should be careful as any package update can introduce environment-dependent
dependencies, making any newly generated requirements.txt environment-dependent too.
As a general rule, it's advised that users should still always execute pip-compile
on each targeted Python environment to avoid issues.
pip-tools is a great tool to improve the reproducibility of builds.
But there are a few things to keep in mind.
- pip-compile will produce different results in different environments, as described in the previous section.
- pip must be used with the PIP_CONSTRAINT environment variable to lock dependencies in build environments, as documented in #8439 (see the sketch after the lock-file example below).
- Dependencies come from many sources.
Continuing the pyproject.toml example from earlier, creating a single lock file could be done like:
$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
# pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
# via django
attrs==22.1.0
# via pytest
backports-zoneinfo==0.2.1
# via django
django==4.1
# via my-cool-django-app (pyproject.toml)
editables==0.3
# via hatchling
hatchling==1.11.1
# via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
# via pytest
packaging==21.3
# via
# hatchling
# pytest
pathspec==0.10.2
# via hatchling
pluggy==1.0.0
# via
# hatchling
# pytest
py==1.11.0
# via pytest
pyparsing==3.0.9
# via packaging
pytest==7.1.2
# via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
# via django
tomli==2.0.1
# via
# hatchling
# pytestSome build backends may also request build dependencies dynamically using the get_requires_for_build_ hooks described in PEP 517 and PEP 660.
This will be indicated in the output with one of the following suffixes:
- (pyproject.toml::build-system.backend::editable)
- (pyproject.toml::build-system.backend::sdist)
- (pyproject.toml::build-system.backend::wheel)
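As a sketch of the PIP_CONSTRAINT approach mentioned in the list above, the generated constraints.txt could then be applied while pip builds and installs the project, assuming a pip version that forwards the variable into isolated build environments:
$ PIP_CONSTRAINT=constraints.txt pip install .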
- pip-compile-multi - pip-compile command wrapper for multiple cross-referencing requirements files.
- pipdeptree to print the dependency tree of the installed packages.
- requirements.in/requirements.txt syntax highlighting:
  - requirements.txt.vim for Vim.
  - Python extension for VS Code.
  - pip-requirements.el for Emacs.
This section lists pip-tools features that are currently deprecated.
- In the next major release, the --allow-unsafe behavior will be enabled by
  default (#989). Use --no-allow-unsafe to keep the old behavior. It is
  recommended to pass --allow-unsafe now to adapt to the upcoming change.
- The legacy resolver is deprecated and will be removed in future versions.
  The new default is --resolver=backtracking.
- In the next major release, the --strip-extras behavior will be enabled by
  default (#1613). Use --no-strip-extras to keep the old behavior.
You can choose between the default backtracking resolver and the deprecated legacy resolver.
The legacy resolver will occasionally fail to resolve dependencies. The backtracking resolver is more robust, but can take longer to run in general.
You can continue using the legacy resolver with --resolver=legacy although
note that it is deprecated and will be removed in a future release.
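For example, to opt in to the legacy resolver explicitly:
$ pip-compile --resolver=legacy requirements.in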