❓ [Question] How to save a TensorRT engine? #3348

Open
lzcchl opened this issue Jan 8, 2025 · 1 comment
Labels: question (Further information is requested)

Comments

lzcchl commented Jan 8, 2025

❓ Question

I have already saved a torch.jit model and run inference with the PyTorch backend successfully, but after searching the project and the issues I cannot find any code, example, or tutorial showing how to save a TensorRT engine so it can be run with the TensorRT backend. Can you help me?

lzcchl added the question label on Jan 8, 2025

narendasan (Collaborator) commented Jan 8, 2025

See the Python API docs at https://pytorch.org/TensorRT/py_api/torch_tensorrt.html and use torch_tensorrt.convert_method_to_trt_engine.
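
For anyone landing here, a minimal sketch along those lines, assuming the TorchScript frontend: the model path, input shape, and precision below are placeholders, and the exact signature of convert_method_to_trt_engine can differ between Torch-TensorRT releases, so check the linked docs for your version.

```python
import torch
import torch_tensorrt

# Load the previously saved TorchScript module (path is a placeholder).
ts_model = torch.jit.load("model_ts.pt").eval().cuda()

# Convert the "forward" method into a serialized TensorRT engine (raw bytes).
# The input shape and precision here are examples; adjust them to your model.
engine_bytes = torch_tensorrt.convert_method_to_trt_engine(
    ts_model,
    "forward",
    inputs=[torch_tensorrt.Input((1, 3, 224, 224))],
    enabled_precisions={torch.float},
)

# Write the serialized engine to disk so the TensorRT runtime can load it later.
with open("model.engine", "wb") as f:
    f.write(engine_bytes)
```

The saved file should then be loadable with the plain TensorRT runtime, e.g. by deserializing the bytes with tensorrt.Runtime(...).deserialize_cuda_engine(...) or by passing it to trtexec --loadEngine.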
