TensorDict still seems not fully mature, but it is already quite useful. If I am not mistaken, it supports most of the operations required to write an einops backend -- einsum is missing.
Before I (or somebody else) do the work of drafting an implementation, I am wondering whether this might be included in einops, whether @arogozhnikov would rather wait and see if tensordict gets merged into PyTorch proper (pytorch/pytorch#112441), or whether the missing einsum functionality would be a showstopper.
For us, it would already be immensely helpful to be able to use rearrange on the batch dimensions of tensordicts.
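To make concrete what that would mean, here is a toy, shape-level illustration in plain Python (no tensordict or einops involved): rearranging the batch dimensions of a tensordict amounts to applying the same transform to the shared leading dims of every entry while leaving each entry's trailing, per-key dims untouched. The function name and the dict-of-shapes representation are purely illustrative.

```python
# Toy illustration: "rearrange on the batch dimensions" transforms only the
# shared leading dims of every entry, keeping the per-key trailing dims.

def merge_batch_dims(shapes, batch_ndim):
    """Collapse the first `batch_ndim` dims of every entry -- the shape-level
    effect of a hypothetical rearrange(td, 'b t ... -> (b t) ...')."""
    merged = {}
    for key, shape in shapes.items():
        batch, rest = shape[:batch_ndim], shape[batch_ndim:]
        prod = 1
        for d in batch:
            prod *= d
        merged[key] = (prod,) + rest
    return merged

# A "tensordict" with batch_size (4, 8): every entry shares those leading dims.
td_shapes = {"obs": (4, 8, 3, 32, 32), "reward": (4, 8)}
print(merge_batch_dims(td_shapes, batch_ndim=2))
# {'obs': (32, 3, 32, 32), 'reward': (32,)}
```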
Cheers
Not really. From a brief look, min/max are also missing.
I guess it would 'just work' with einops with a simple backend.
Depending on an in-development library is a no-go for einops; tensordict isn't compatible with the way other libraries are tested.
The array API won't work for this case either.
My suggestion: implement a backend (see TorchBackend for an example) and put it in a separate file. When you import that file, einops will pick up your backend.
If there are other cases of non-standard but 'likely working' backends, I can think about a mechanism to auto-import those.
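A sketch of that separate-file backend, modeled on einops's own `TorchBackend`, might look like the following. This is untested and partly an assumption: the class name `TensorDictBackend` is invented, and the tensordict-side methods used (`batch_size`, `reshape`, `permute`, `unsqueeze`, stacking via `torch.stack`) follow tensordict's documented API but have not been verified against an einops run. Treat it as a starting point, not a working implementation.

```python
# Hypothetical sketch of a tensordict backend for einops, modeled on
# einops._backends.TorchBackend. All tensordict method usages below are
# assumptions taken from tensordict's documentation.

try:
    from einops._backends import AbstractBackend
except ImportError:  # fallback so the sketch can be read without einops installed
    class AbstractBackend:
        framework_name = None


class TensorDictBackend(AbstractBackend):
    # einops looks for AbstractBackend subclasses whose framework module is
    # already imported, so importing this file should be enough to register it.
    framework_name = "tensordict"

    def __init__(self):
        import tensordict
        self.tensordict = tensordict

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, self.tensordict.TensorDictBase)

    def shape(self, x):
        # einops should operate on the shared batch dimensions only
        return tuple(x.batch_size)

    def reshape(self, x, shape):
        return x.reshape(*shape)

    def transpose(self, x, axes):
        return x.permute(*axes)

    def add_axis(self, x, new_position):
        return x.unsqueeze(new_position)

    def stack_on_zeroth_dimension(self, tensors):
        import torch
        return torch.stack(tensors, dim=0)
```

With this in place, importing the file should let `rearrange` dispatch on tensordicts via `is_appropriate_type`; einsum (and, per the above, min/max) would still raise until tensordict grows those operations.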
For context: tensordicts are dicts of tensors sharing a common batch dimension; tensorclasses are dataclasses of tensors.
https://pytorch.org/tensordict/