
Jakub Kriz
Software Engineer & Python Enthusiast
Dependency Management in Python: pip-tools
Why pip-tools?
Python's dependency management has long been a pain point. Unlike Rust, Go, or even JavaScript, Python lacks a built-in lock file mechanism, which leads to two key issues:
Ambiguity: Traditional requirements.txt files conflate direct dependencies (what your project explicitly needs) with transitive dependencies (what your dependencies pull in).
Reproducibility: Without pinned versions, pip install can yield different environments over time, so code that works in one environment may fail in another.
pip-tools solves these problems by augmenting (not replacing) the standard pip workflow:
Direct dependencies (requirements.in): What your project explicitly uses, with version constraints you control.
Lock file (requirements.txt): All dependencies (direct and transitive) pinned with exact versions for reproducibility.
While tools like Poetry and Pipenv offer similar features, pip-tools requires minimal changes to existing workflows and can be adopted incrementally.
Note: The files don't have to be named requirements.in and requirements.txt, but that's the most intuitive choice, since people are already accustomed to installing dependencies from requirements.txt.
Quick Start
1. Install pip-tools
pip install pip-tools
Note: You’ll typically want to install pip-tools in your virtual environment rather than globally, ensuring version consistency across your team.
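For a fresh project, a typical setup might look like this (a sketch; the .venv directory name is just a common convention):
# Create and activate a virtual environment, then install pip-tools into it
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install pip-tools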
2. Define Direct Dependencies
# requirements.in
requests>=2.0
pydantic~=2.0
Note: The .in file contains only your direct dependencies with flexible version specifiers. You can use:
- package>=1.0: Minimum version (1.0 or newer)
- package~=1.0: Compatible release (1.x, but not 2.0)
- package==1.2.3: Exact version
- Just package: Latest version (generally not recommended for production code, as it may cause unpredictable updates)
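Putting these together, a requirements.in mixing the different specifiers might look like this (package names and versions are purely illustrative):
# requirements.in
requests>=2.25.0   # minimum version
pydantic~=2.0      # any 2.x release
click==8.1.7       # exact pin
rich               # unpinned; use sparingly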
3. Generate a Lock File
pip-compile --generate-hashes # Creates requirements.txt
Note: This command resolves your direct dependencies and all of their transitive dependencies, creating a locked requirements.txt with exact versions. The --generate-hashes flag adds cryptographic hashes for security verification. You can also use other flags like:
- --output-file=requirements-dev.txt: Specify a different output filename
- --upgrade: Update all packages to their latest compatible versions
- --upgrade-package requests: Update only specific packages
- --annotation-style=line: Control the style of the '# via' comments that show which dependency required each package
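To make this concrete, here is an abbreviated sketch of what the compiled lock file for the requirements.in above can look like (shown without --generate-hashes for brevity; the exact versions and annotations will differ on your machine):
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
#    pip-compile
#
certifi==2024.2.2
    # via requests
charset-normalizer==3.3.2
    # via requests
idna==3.6
    # via requests
pydantic==2.6.1
    # via -r requirements.in
pydantic-core==2.16.2
    # via pydantic
requests==2.31.0
    # via -r requirements.in
urllib3==2.2.0
    # via requests
The '# via' annotations record why each pinned package is present, which makes the lock file reviewable in pull requests.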
4. Sync Your Environment
pip-sync requirements.txt # Exact replica every time
Note: Unlike pip install -r requirements.txt, which only adds or updates packages, pip-sync makes your environment match requirements.txt exactly: installing, updating, and removing packages as needed. This ensures absolute consistency across all environments.
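If you'd like to preview the changes before anything is touched, pip-sync has a dry-run mode:
# Show what would be installed and uninstalled, without changing the environment
pip-sync --dry-run requirements.txt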
5. Updating Dependencies
To update all dependencies to their latest compatible versions:
pip-compile --upgrade requirements.in
To update just a single package:
pip-compile --upgrade-package requests requirements.in
Tip: Use --upgrade-package package==1.2.3 to pin a package to a specific version.
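The flag can also be repeated to bump several packages in one pass, for example (package names and versions here are just illustrations):
# Upgrade requests to the latest compatible release and pin pydantic,
# leaving everything else untouched
pip-compile --upgrade-package requests --upgrade-package pydantic==2.6.1 requirements.in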
Multi-Environment Setup
A common pattern is to separate your dependencies by environment. This separation is practical because:
- Development tools like test frameworks, linters, and type checkers shouldn’t be installed in production environments
- Production-only dependencies like web servers and performance monitoring tools aren’t needed during development
- Keeping environments lean improves installation speed and reduces attack surface in production
For example, you wouldn't want to install pytest, Black, or mypy in your production environment, but they're essential for development. Similarly, production might need Gunicorn and Uvicorn, which aren't necessary for local development.
Here’s how you might structure this:
# base.in
requests>=2.25.0
pydantic>=2.0.0
# dev.in
-c base.txt
black==24.0.0
pytest>=7.0.0
mypy
# prod.in
-c base.txt
gunicorn>=20.1.0
uvicorn[standard]>=0.20.0
Compile each file in order:
# First compile base
pip-compile base.in
# Then compile dependent files
pip-compile dev.in
pip-compile prod.in
Warning: Always compile constraint files (-c base.txt) before compiling the files that depend on them. If you update base.in, recompile all dependent files afterward.
Note: The -c base.txt constraint flag tells pip-tools to use the exact versions from base.txt when the same packages appear in multiple environments, preventing version conflicts.
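Once everything is compiled, sync whichever combination of lock files an environment needs; pip-sync accepts multiple files:
# Development machine: base dependencies plus dev tooling
pip-sync base.txt dev.txt
# Production host: base dependencies plus the production servers
pip-sync base.txt prod.txt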
Component-Based Dependency Management
Modern Python projects often combine multiple components with different dependency needs: libraries, applications, scripts, and tooling. pip-tools excels at managing these varied requirements while ensuring compatibility across your project.
For example, in a data science project, you might have:
- A core library with minimal dependencies for wide compatibility
- Data processing scripts that depend on heavy packages like pandas and numpy
- Visualization tools requiring matplotlib and specialized plotting libraries
- CI/CD automation needing only testing frameworks
With pip-tools, you can create an optimized dependency structure for each component:
project/
├── my_library/
│   └── ...
├── analysis/
│   ├── process.py
│   └── visualize.py
└── requirements/
    ├── library.in
    ├── analysis.in
    ├── viz.in
    └── dev.in
This approach allows precise dependency management for each component:
# library.in
requests>=2.0
pydantic>=2.0.0
# analysis.in
-c library.txt
pandas>=2.0.0
numpy
scikit-learn
# viz.in
-c library.txt
matplotlib
seaborn
plotly
# dev.in
-c library.txt
pytest
black
mypy
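As in the multi-environment setup, compile the constraint target first and then the files that reference it (paths assume the layout shown above):
# library.txt must exist before the other files can constrain against it
pip-compile requirements/library.in
pip-compile requirements/analysis.in
pip-compile requirements/viz.in
pip-compile requirements/dev.in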
This modular approach offers several advantages:
- Minimized dependency footprint: Users only install what they need for their specific task
- Improved compatibility: Core libraries can maintain fewer dependencies
- Faster installations: CI/CD pipelines can install only what’s needed for each test suite
- Clearer organization: Dependencies are explicitly tied to project components
- Simplified onboarding: New team members can focus on just the components they need
For example, a contributor working only on the core library could run:
pip-sync requirements/library.txt requirements/dev.txt
While a data scientist using the analysis tools would run:
pip-sync requirements/analysis.txt
This approach scales well in complex projects and teams, where different members may work on different components with different dependency needs.
Comparison with Alternatives
Choose pip-tools when:
- You want lock files without changing existing workflows
- Your team already uses pip/venv
- You need per-environment control (dev vs prod)
- You want to adopt better practices incrementally
Conclusion
When evaluating Python dependency management tools, pip-tools stands out as a pragmatic solution that balances modern features with workflow compatibility.
Why pip-tools works
Evolutionary, not revolutionary: Rather than replacing existing workflows, pip-tools enhances them while keeping familiar tools and commands.
Minimal cognitive overhead: Direct dependencies go in .in files, compiled output goes in .txt files—simple and effective.
Precise version control: Lock files ensure exact reproducibility across environments.
Flexibility and adaptability: Works with everything from simple scripts to complex packages without forcing structure.
The ecosystem perspective
The Python packaging landscape continues to evolve. While Poetry and PDM offer integrated solutions, pip-tools provides modern features without requiring a complete workflow change.
uv, a Python package manager written in Rust, offers dramatic speed improvements in dependency resolution and installation but is still maturing. The good news: pip-tools' approach is compatible with uv, meaning you can adopt pip-tools now and potentially incorporate uv later as it stabilizes.
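If you do experiment with uv, it ships drop-in equivalents of the pip-tools commands, so the same .in files keep working:
# Same inputs and outputs as pip-compile / pip-sync, resolved by uv
uv pip compile requirements.in -o requirements.txt
uv pip sync requirements.txt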
Choose pip-tools when you want incremental improvement with maximum compatibility. Consider alternatives when starting fresh projects or when specific features like integrated environment management are essential.