TensorDict

Installation | General features | Tensor-like features | Distributed capabilities | TensorDict for functional programming | TensorDict for parameter serialization | Lazy preallocation | Nesting TensorDicts | TensorClass

TensorDict is a dictionary-like class that inherits properties from tensors, such as indexing, shape operations, casting to device or point-to-point communication in distributed settings.

The main purpose of TensorDict is to make code-bases more readable and modular by abstracting away tailored operations:

for i, data in enumerate(dataset):
    # the model reads and writes tensordicts
    data = model(data)
    loss = loss_module(data)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

With this level of abstraction, one can recycle a training loop for highly heterogeneous tasks. Each individual step of the training loop (data collection and transform, model prediction, loss computation, etc.) can be tailored to the use case at hand without impacting the others. For instance, the above example can easily be reused across classification and segmentation tasks, among many others.

Features

General

A tensordict is primarily defined by its batch_size (or shape) and its key-value pairs:

>>> from tensordict import TensorDict
>>> import torch
>>> data = TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": torch.zeros(3, 4, 5, dtype=torch.bool),
... }, batch_size=[3, 4])

The batch_size and the first dimensions of each of the tensors must be compliant. The tensors can be of any dtype and device. Optionally, one can restrict a tensordict to live on a dedicated device, in which case every tensor written to it is sent to that device:

>>> data = TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": torch.zeros(3, 4, 5, dtype=torch.bool),
... }, batch_size=[3, 4], device="cuda:0")
>>> data["key 3"] = torch.randn(3, 4, device="cpu")
>>> assert data["key 3"].device == torch.device("cuda:0")

But that is not all: you can also store nested values in a tensordict:

>>> data["nested", "key"] = torch.zeros(3, 4) # the batch-size must match

and any nested tuple structure will be unravelled to make it easy to read code and write ops programmatically:

>>> data["nested", ("supernested", ("key",))] = torch.zeros(3, 4) # the batch-size must match
>>> assert (data["nested", "supernested", "key"] == 0).all()
>>> assert (("nested",), "supernested", (("key",),)) in data.keys(include_nested=True)  # this works too!

You can also store non-tensor data in tensordicts:

>>> data = TensorDict({"a-tensor": torch.randn(1, 2)}, batch_size=[1, 2])
>>> data["non-tensor"] = "a string!"
>>> assert data["non-tensor"] == "a string!"

Tensor-like features

TensorDict objects can be indexed exactly like tensors. The result of indexing a TensorDict is another TensorDict whose tensors are indexed along the required dimension:

>>> data = TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": torch.zeros(3, 4, 5, dtype=torch.bool),
... }, batch_size=[3, 4])
>>> sub_tensordict = data[..., :2]
>>> assert sub_tensordict.shape == torch.Size([3, 2])
>>> assert sub_tensordict["key 1"].shape == torch.Size([3, 2, 5])

Similarly, one can build tensordicts by stacking or concatenating several tensordicts:

>>> tensordicts = [TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": torch.zeros(3, 4, 5, dtype=torch.bool),
... }, batch_size=[3, 4]) for _ in range(2)]
>>> stack_tensordict = torch.stack(tensordicts, 1)
>>> assert stack_tensordict.shape == torch.Size([3, 2, 4])
>>> assert stack_tensordict["key 1"].shape == torch.Size([3, 2, 4, 5])
>>> cat_tensordict = torch.cat(tensordicts, 0)
>>> assert cat_tensordict.shape == torch.Size([6, 4])
>>> assert cat_tensordict["key 1"].shape == torch.Size([6, 4, 5])

TensorDict instances can also be reshaped, viewed, squeezed and unsqueezed:

>>> data = TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": torch.zeros(3, 4, 5, dtype=torch.bool),
... }, batch_size=[3, 4])
>>> print(data.view(-1).batch_size)
torch.Size([12])
>>> print(data.reshape(-1).batch_size)
torch.Size([12])
>>> print(data.unsqueeze(-1).batch_size)
torch.Size([3, 4, 1])

One can also send tensordicts from device to device, place them in shared memory, clone them, update them in-place or not, split them, unbind them, expand them, etc.

If a functionality is missing, it is easy to implement it by applying a function to every tensor with apply() or apply_():

tensordict_uniform = data.apply(lambda tensor: tensor.uniform_())

apply() is also handy for filtering a tensordict, for instance:

data = TensorDict({"a": torch.tensor(1.0, dtype=torch.float), "b": torch.tensor(1, dtype=torch.int64)}, [])
data_float = data.apply(lambda x: x if x.dtype == torch.float else None) # contains only the "a" key
assert "b" not in data_float

Distributed capabilities

Complex data structures can be cumbersome to synchronize in distributed settings. tensordict solves that problem with synchronous and asynchronous helper methods such as recv, irecv, send and isend that behave like their torch.distributed counterparts:

>>> # on all workers
>>> data = TensorDict({"a": torch.zeros(()), ("b", "c"): torch.ones(())}, [])
>>> # on worker 1
>>> data.isend(dst=0)
>>> # on worker 0
>>> data.irecv(src=1)

When nodes share a common scratch space, the MemmapTensor backend can be used to seamlessly send, receive and read a huge amount of data.

TensorDict for functional programming

We also provide an API to use TensorDict in conjunction with FuncTorch. For instance, TensorDict makes it easy to concatenate model weights to do model ensembling:

>>> from torch import nn
>>> from tensordict import TensorDict
>>> import torch
>>> from torch import vmap
>>> layer1 = nn.Linear(3, 4)
>>> layer2 = nn.Linear(4, 4)
>>> model = nn.Sequential(layer1, layer2)
>>> params = TensorDict.from_module(model)
>>> # we represent the weights hierarchically
>>> weights1 = TensorDict(layer1.state_dict(), []).unflatten_keys(".")
>>> weights2 = TensorDict(layer2.state_dict(), []).unflatten_keys(".")
>>> assert (params == TensorDict({"0": weights1, "1": weights2}, [])).all()
>>> # Let's use our functional module
>>> x = torch.randn(10, 3)
>>> with params.to_module(model):
...     out = model(x)
>>> # an ensemble of models: we stack params along the first dimension...
>>> params_stack = torch.stack([params, params], 0)
>>> # ... and use it as an input we'd like to pass through the model
>>> def func(x, params):
...     with params.to_module(model):
...         return model(x)
>>> y = vmap(func, (None, 0))(x, params_stack)
>>> print(y.shape)
torch.Size([2, 10, 4])

Moreover, tensordict modules are compatible with torch.fx and (soon) torch.compile, which means that you can get the best of both worlds: a codebase that is both readable and future-proof as well as efficient and portable!

TensorDict for parameter serialization and building datasets

TensorDict offers an API for parameter serialization that can be >3x faster than regular calls to torch.save(state_dict). Moreover, because tensors will be saved independently on disk, you can deserialize your checkpoint on an arbitrary slice of the model.

>>> model = nn.Sequential(nn.Linear(3, 4), nn.Linear(4, 3))
>>> params = TensorDict.from_module(model)
>>> params.memmap("/path/to/saved/folder/", num_threads=16)  # adjust num_threads for speed
>>> # load params
>>> params = TensorDict.load_memmap("/path/to/saved/folder/", num_threads=16)
>>> params.to_module(model)  # load onto model
>>> params["0"].to_module(model[0])  # load on a slice of the model
>>> # in the latter case we could also have loaded only the slice we needed
>>> params0 = TensorDict.load_memmap("/path/to/saved/folder/0", num_threads=16)
>>> params0.to_module(model[0])  # load on a slice of the model

The same functionality can be used to access data in a dataset stored on disk. Storing a single contiguous tensor on disk, accessed through the tensordict.MemoryMappedTensor primitive, and reading slices of it is not only much faster than loading single files one at a time, but also easier and safer (because no pickling or third-party library is involved):

# allocate memory of the dataset on disk
data = TensorDict({
    "images": torch.zeros((128, 128, 3), dtype=torch.uint8),
    "labels": torch.zeros((), dtype=torch.int)}, batch_size=[])
data = data.expand(1000000)
data = data.memmap_like("/path/to/dataset")
# ==> Fill your dataset here
# Let's get 3 items of our dataset:
data[torch.tensor([1, 10000, 500000])]  # This is much faster than loading the 3 images independently

Preprocessing with TensorDict.map

Preprocessing huge contiguous (or not!) datasets can be done via TensorDict.map which will dispatch a task to various workers:

import torch
from tensordict import TensorDict, MemoryMappedTensor
import tempfile

def process_data(data):
    images = data.get("images").flip(-2).clone()
    labels = data.get("labels") // 10
    # we update the tensordict in-place
    data.set_("images", images)  # flip images
    data.set_("labels", labels)  # cluster labels
    return data

if __name__ == "__main__":
    # create `data` here, e.g. with memmap_like as shown above
    data_preproc = data.map(process_data, num_workers=4, chunksize=0, pbar=True)  # chunksize=0: process one item at a time

Lazy preallocation

Pre-allocating tensors can be cumbersome and hard to scale if the list of preallocated items varies according to the script configuration. TensorDict solves this in an elegant way. Assume you are working with a function foo() -> TensorDict, e.g.

def foo():
    data = TensorDict({}, batch_size=[])
    data["a"] = torch.randn(3)
    data["b"] = TensorDict({"c": torch.zeros(2)}, batch_size=[])
    return data

and you would like to call this function repeatedly. You could do this in two ways. The first would simply be to stack the calls to the function:

data = torch.stack([foo() for _ in range(N)])

However, you could also choose to preallocate the tensordict:

data = TensorDict({}, batch_size=[N])
for i in range(N):
    data[i] = foo()

which also results in a tensordict (here with N = 10):

TensorDict(
    fields={
        a: Tensor(torch.Size([10, 3]), dtype=torch.float32),
        b: TensorDict(
            fields={
                c: Tensor(torch.Size([10, 2]), dtype=torch.float32)},
            batch_size=torch.Size([10]),
            device=None,
            is_shared=False)},
    batch_size=torch.Size([10]),
    device=None,
    is_shared=False)

At the first iteration, the empty tensordict is automatically populated with empty tensors of batch size N. After that, updates are written in-place. Note that this also works with a shuffled series of indices (pre-allocation does not require you to go through the tensordict in order).

Nesting TensorDicts

It is possible to nest tensordicts. The only requirement is that the sub-tensordict be indexable under the parent tensordict, i.e. its batch size must start with the parent's batch size (and may extend it).

We can switch easily between hierarchical and flat representations. For instance, the following code will result in a single-level tensordict with keys "key 1" and "key 2.sub-key":

>>> data = TensorDict({
...     "key 1": torch.ones(3, 4, 5),
...     "key 2": TensorDict({"sub-key": torch.randn(3, 4, 5, 6)}, batch_size=[3, 4, 5])
... }, batch_size=[3, 4])
>>> tensordict_flatten = data.flatten_keys(separator=".")

Accessing nested tensordicts can be achieved with a single index:

>>> sub_value = data["key 2", "sub-key"]

TensorClass

Content flexibility comes at the cost of predictability. In some cases, developers may be looking for a data structure with more explicit behavior. tensordict provides a dataclass-like decorator for creating custom dataclasses that support tensordict operations:

>>> from tensordict.prototype import tensorclass
>>> import torch
>>>
>>> @tensorclass
... class MyData:
...    image: torch.Tensor
...    mask: torch.Tensor
...    label: torch.Tensor
...
...    def mask_image(self):
...        return self.image[self.mask.expand_as(self.image)].view(*self.batch_size, -1)
...
...    def select_label(self, label):
...        return self[self.label == label]
...
>>> images = torch.randn(100, 3, 64, 64)
>>> label = torch.randint(10, (100,))
>>> mask = torch.zeros(1, 64, 64, dtype=torch.bool).bernoulli_().expand(100, 1, 64, 64)
>>>
>>> data = MyData(images, mask, label=label, batch_size=[100])
>>>
>>> print(data.select_label(1))
MyData(
    image=Tensor(torch.Size([11, 3, 64, 64]), dtype=torch.float32),
    label=Tensor(torch.Size([11]), dtype=torch.int64),
    mask=Tensor(torch.Size([11, 1, 64, 64]), dtype=torch.bool),
    batch_size=torch.Size([11]),
    device=None,
    is_shared=False)
>>> print(data.mask_image().shape)
torch.Size([100, 6117])
>>> print(data.reshape(10, 10))
MyData(
    image=Tensor(torch.Size([10, 10, 3, 64, 64]), dtype=torch.float32),
    label=Tensor(torch.Size([10, 10]), dtype=torch.int64),
    mask=Tensor(torch.Size([10, 10, 1, 64, 64]), dtype=torch.bool),
    batch_size=torch.Size([10, 10]),
    device=None,
    is_shared=False)

As this example shows, one can write specific data structures with dedicated methods while still enjoying TensorDict features such as shape operations (e.g. reshape or permutations), data manipulation (indexing, cat and stack) or calling arbitrary functions through the apply method (and many more).

Tensorclasses support nesting and, in fact, all the TensorDict features.

Installation

With Pip:

To install the latest stable version of tensordict, simply run

pip install tensordict

This works with Python 3.7 and later, and PyTorch 1.12 and later.

To enjoy the latest features, one can use

pip install tensordict-nightly

With Conda:

Install tensordict from the conda-forge channel:

conda install -c conda-forge tensordict

Citation

If you're using TensorDict, please refer to this BibTeX entry to cite this work:

@misc{bou2023torchrl,
      title={TorchRL: A data-driven decision-making library for PyTorch}, 
      author={Albert Bou and Matteo Bettini and Sebastian Dittert and Vikash Kumar and Shagun Sodhani and Xiaomeng Yang and Gianni De Fabritiis and Vincent Moens},
      year={2023},
      eprint={2306.00577},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Disclaimer

TensorDict is at the beta stage, meaning that backward-compatibility-breaking changes may be introduced, but they should come with a warning. Hopefully these will not happen too often, as the current roadmap mostly involves adding new features and building compatibility with the broader PyTorch ecosystem.

License

TensorDict is licensed under the MIT License. See LICENSE for details.
