
[Feature suggestion] Support of tensordict (and tensorclass) #343

Open
fzimmermann89 opened this issue Sep 28, 2024 · 3 comments

@fzimmermann89

Tensordicts are dicts of tensors with a common batch dimension; tensorclasses are dataclasses of tensors.

https://pytorch.org/tensordict/

Tensordict does not seem fully mature yet, but it is already quite useful. If I am not mistaken, it supports most of the operations required to write an einops backend -- only einsum is missing.

Before I (or somebody else) does the work of drafting an implementation, I am wondering whether this could be included in einops, whether @arogozhnikov would rather wait and see if tensordict gets merged into pytorch proper (pytorch/pytorch#112441), or whether the missing einsum functionality would be a stopper.

For us, it would already be immensely helpful to be able to use rearrange on the batch dimensions of tensordicts.
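To illustrate, this is the kind of call we have in mind (hypothetical for now, since einops has no tensordict backend; the pattern and shapes are just an example):

```python
import torch
from tensordict import TensorDict
from einops import rearrange

# A tensordict whose entries share the batch dimensions (batch=8, time=4);
# trailing (non-batch) dimensions differ per entry.
td = TensorDict(
    {"image": torch.rand(8, 4, 3, 64, 64), "label": torch.rand(8, 4)},
    batch_size=[8, 4],
)

# Hypothetical: rearrange acting only on the shared batch dimensions,
# flattening batch and time into one leading dimension.
flat = rearrange(td, "b t ... -> (b t) ...")  # batch_size would become [32]
```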

Cheers

@arogozhnikov commented Sep 29, 2024

Hi Felix,

> missing einsum functionality would be a stopper.

Not really. From a brief look, min/max are also missing.

  • I guess it would 'just work' with einops given a simple backend
  • Depending on an in-development library is a no-go for einops; tensordict isn't compatible with the way other libraries are tested
  • The array API won't work for this case

My suggestion: implement a backend (see TorchBackend for an example) and put it in a separate file. When you import that file, einops will pick up your backend.

If there are other cases of non-standard but 'likely working' backends, I can think of a mechanism to auto-import those.
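For reference, a rough sketch of what such a backend could look like, assuming the AbstractBackend interface from einops._backends and that TensorDict's reshape/permute operate over its batch dimensions; the class below is illustrative, not a tested implementation:

```python
from einops._backends import AbstractBackend


class TensorDictBackend(AbstractBackend):
    """Sketch of an einops backend for tensordict (illustrative only)."""

    framework_name = "tensordict"

    def __init__(self):
        import tensordict

        self.tensordict = tensordict

    def is_appropriate_type(self, tensor):
        return isinstance(tensor, self.tensordict.TensorDictBase)

    def shape(self, x):
        # einops only sees the shared batch dimensions of a tensordict
        return tuple(x.batch_size)

    def reshape(self, x, shape):
        return x.reshape(*shape)

    def transpose(self, x, axes):
        return x.permute(*axes)

    # reduce, add_axis, tile, stack_on_zeroth_dimension, einsum, ... would be
    # needed for full coverage and are omitted in this sketch.


# einops discovers backends by walking AbstractBackend subclasses, so importing
# this module (after importing tensordict) should be enough for it to be found.
```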

@arogozhnikov

Also, don't forget to update this issue if it works or if there are other hurdles.

@vmoens commented Dec 18, 2024

We added min/max/amin/amax.

Upstreaming to core is still on the roadmap.

Is there anything I can do on the tensordict side to unblock this?
