
[BUG] use einops torch layers with torch_tensorrt compile failed #354

Open
Jason3900 opened this issue Dec 17, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@Jason3900

Describe the bug
I use torch_tensorrt.compile to compile my VQGAN model, which contains einops ops. Since torch.jit.ScriptModule does not allow arbitrary keyword arguments (**kwargs), I use from einops.layers.torch import Rearrange to create layers instead. However, it still throws the following error.

Traceback (most recent call last):
  File "/xxx/convert_trt.py", line 102, in main
    trt_model = torch_tensorrt.compile(
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/_compile.py", line 212, in compile
    compiled_ts_module: torch.jit.ScriptModule = torchscript_compile(
  File "/usr/local/lib/python3.10/dist-packages/torch_tensorrt/ts/_compiler.py", line 154, in compile
    compiled_cpp_mod = _C.compile_graph(module._c, _parse_compile_spec(spec))
RuntimeError: Unknown type Dict[int, __torch__.einops.einops.TransformRecipe] encountered in graph lowering. This type is not supported in ONNX export.

I've tried to set:

from einops._torch_specific import allow_ops_in_compiled_graph  # requires einops>=0.6.1
allow_ops_in_compiled_graph()

But it didn't work.

@Jason3900 Jason3900 added the bug Something isn't working label Dec 17, 2024
@arogozhnikov
Owner

arogozhnikov commented Dec 17, 2024

einops layers work in torch.compile, torch.jit.script and can be traced with torch.jit.trace.

I don't know the details of torch_tensorrt, but I recommend you start by reporting this to their project.

According to pytorch/TensorRT#1005 there is interest in supporting einops, but they seemingly never implemented it.

@Jason3900
Author

> einops layers work in torch.compile, torch.jit.script and can be traced with torch.jit.trace.
>
> I don't know details of torch_tensorrt, but I recommend you to start with reporting to their project.
>
> According to pytorch/TensorRT#1005 there is an interest in supporting einops, but they seemingly never did this.

Thanks! But for now, is there a workaround to make it work?

@arogozhnikov
Owner

> Thanks! But for now, is there a workaround to make it work?

Once again: ask pytorch_tensorrt. I have never seen or used it, and I don't know why their compilation process does not respect torch's compile registration, etc.
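For readers blocked on the same error, one plausible interim workaround (my sketch, not advice given in this thread) is to replace the einops layer with the equivalent native permute/reshape calls, which the TorchScript/ONNX lowering path understands; the pattern "b c h w -> b (h w) c" is assumed here purely for illustration:

```python
import torch

x = torch.zeros(1, 3, 8, 8)

# Equivalent of einops: rearrange(x, "b c h w -> b (h w) c")
b, c, h, w = x.shape
y = x.permute(0, 2, 3, 1).reshape(b, h * w, c)

print(y.shape)  # torch.Size([1, 64, 3])
```

Whether this is acceptable depends on how many einops patterns the model uses; each one has a mechanical permute/reshape translation.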

@Jason3900
Author

Got it, thanks~
