light-the-torch
Install PyTorch distributions with computation backend auto-detection
light-the-torch is a small utility that wraps pip to ease the installation process
for PyTorch distributions and third-party packages that depend on them. It auto-detects
a compatible CUDA version from the local setup and installs the correct PyTorch binaries
without user intervention.
Why do I need it?
PyTorch distributions are fully pip install'able, but PyPI, the default pip search
index, has some limitations:
- PyPI normally only allows binaries up to a size of approximately 60 MB. One can request a file size limit increase (and the PyTorch team probably does that for every release), but it is still not enough: although PyTorch has pre-built binaries for Windows with CUDA support, they cannot be hosted on PyPI due to their size.
- PyTorch uses local version specifiers to indicate the computation backend a binary
was compiled for, for example torch==1.11.0+cpu. Unfortunately, local specifiers are not allowed on PyPI. Thus, only the binaries compiled with a single CUDA version are uploaded, without any indication of that CUDA version. If you do not have a CUDA-capable GPU, downloading this is only a waste of bandwidth and disk space. If, on the other hand, your NVIDIA driver version simply doesn't support the CUDA version the binary was compiled with, you can't use any of the GPU features.
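The local version segment mentioned above is defined by PEP 440. A quick check with the third-party packaging library (assuming it is installed) shows how the computation backend can be read out of such a version string:

```python
from packaging.version import Version

# PyTorch encodes the computation backend in the local version segment,
# i.e. everything after the "+".
version = Version("1.11.0+cu113")

print(version.public)  # 1.11.0
print(version.local)   # cu113 -- the CUDA backend the binary was compiled for
```

Since PyPI rejects uploads that carry such a local segment, this backend information is exactly what gets lost when the binaries are published there.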
To overcome this, PyTorch also hosts all binaries themselves. To access them, you can
still pip install them, but you have to pass some additional options:
pip install torch==1.11.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
While this is certainly an improvement, it also has its downside: in addition to the
computation backend, the version has to be specified exactly. Without knowing what the
latest release is, it is impossible to install it as simply as pip install torch
normally would.
At this point you might justifiably ask: why don't you just use conda as PyTorch
recommends?
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
This should cover all cases, right? Well, no. Let's first look at this from a user's
perspective: you still have to specify the computation backend manually. Of course,
you can drop the =11.3 above, but that would give you the latest cudatoolkit, which
might not be compatible with your local setup. Plus, if you don't have access to a GPU,
you need to swap cudatoolkit for cpuonly manually.
Now imagine you are the author of a library that depends on PyTorch and targets a
broader audience than just experts: the scenario above is a problem, because new users
have to jump through additional hoops. In addition, new users might not be familiar with
conda, whereas almost every Python tutorial features a short introduction to pip.
If you go with conda nonetheless, you now have to decide how you want to publish your
library. The obvious choice would be to publish on the
conda-forge channel to benefit from all their
infrastructure. Unfortunately, this is not as easy as it sounds: conda-forge does not
allow your package to depend on packages hosted in other channels. PyTorch publishes
its binaries to its own channel (-c pytorch), so you cannot depend on the
official binaries. There is a
community package, but it only
publishes CPU binaries. Additionally, there are a few
binaries with CUDA support, but the range is
limited, with no support for Windows and only selected CUDA versions for Linux. Thus, if
you don't want to limit your options, you would have to set up and maintain your own
channel.
If none of this sounds appealing to you and you just want to have the same
user experience as pip install for PyTorch distributions, light-the-torch was made
for you.
How do I install it?
Installing light-the-torch is as easy as
pip install light-the-torch
Since light-the-torch depends on pip, which might be upgraded during the installation,
Windows users should install it with
python -m pip install light-the-torch
How do I use it?
After light-the-torch is installed, you can use its CLI interface ltt as a drop-in
replacement for pip:
ltt install torch
In fact, ltt is pip with a few added options:
- By default, ltt uses the local NVIDIA driver version to select the correct binary for you. You can pass the --pytorch-computation-backend option to manually specify the computation backend you want to use:

ltt install --pytorch-computation-backend=cu102 torch

- By default, ltt installs stable PyTorch binaries. In addition, PyTorch provides nightly, test, and long-time support (LTS) binaries. You can switch the channel you want to install from with the --pytorch-channel option:

ltt install --pytorch-channel=nightly torch

If the channel option is not passed, using pip's builtin --pre option will install PyTorch test binaries.
Of course, you are not limited to installing only PyTorch distributions. Everything shown above also works when installing packages that depend on PyTorch:
ltt install --pytorch-computation-backend=cpu --pytorch-channel=nightly pystiche
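To illustrate what the driver-based auto-detection has to accomplish, here is a rough, hypothetical sketch of mapping a Linux NVIDIA driver version to the newest CUDA backend it supports. The table is abbreviated and the function name is made up for illustration; it is not light-the-torch's actual code or data:

```python
# Hypothetical sketch: pick the newest CUDA backend the local NVIDIA
# driver supports, falling back to CPU-only binaries. The table below
# is abbreviated and illustrative, not light-the-torch's actual data.
MINIMUM_DRIVER = {
    # CUDA backend -> minimum required Linux driver version (major, minor)
    "cu113": (465, 19),
    "cu111": (455, 23),
    "cu102": (440, 33),
    "cu101": (418, 39),
}


def select_backend(driver_version):
    """Return the newest supported CUDA backend, or 'cpu' as fallback."""
    # Try the backends with the highest driver requirements first.
    for backend, minimum in sorted(
        MINIMUM_DRIVER.items(), key=lambda item: item[1], reverse=True
    ):
        if driver_version >= minimum:
            return backend
    return "cpu"


print(select_backend((470, 57)))  # cu113
print(select_backend((410, 0)))   # cpu
```

The real detection additionally has to read the driver version from the system (e.g. from the NVIDIA driver itself) and intersect the result with the backends PyTorch actually ships binaries for.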
How does it work?
The authors of pip do not condone the use of pip internals, as they might break
without warning. As a result of this, pip has no capability for plugins to hook into
specific tasks.
light-the-torch works by monkey-patching pip internals at runtime:
- While searching for a download link for a PyTorch distribution, light-the-torch replaces the default search index with an official PyTorch download link. This is equivalent to calling pip install with the -f option, but only for PyTorch distributions.
- While evaluating possible PyTorch installation candidates, light-the-torch culls binaries that are not compatible with the available hardware.
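As a general illustration of the technique (not light-the-torch's actual patches), monkey-patching means reassigning an attribute of an already-imported module at runtime, so that every caller transparently picks up the new behavior:

```python
# Minimal illustration of monkey-patching: wrap a function of another
# module at runtime so all callers transparently get the new behavior.
# This mirrors the general technique only, not what light-the-torch
# actually patches inside pip.
import json

_original_dumps = json.dumps


def patched_dumps(obj, **kwargs):
    # Force sorted keys for every caller, without them opting in.
    kwargs.setdefault("sort_keys", True)
    return _original_dumps(obj, **kwargs)


json.dumps = patched_dumps  # the patch: reassign the module attribute

# Any code calling json.dumps from now on gets the patched version.
print(json.dumps({"b": 2, "a": 1}))  # {"a": 1, "b": 2}
```

Because the patch only changes behavior for a narrow case (here: the default of one keyword argument), unrelated callers keep working, which is the same property that lets light-the-torch restrict its changes to PyTorch distributions.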