🐛 [Bug] [Dynamic Shapes] Encountered bug when using Torch-TensorRT #3140
Comments
@narendasan can you help me solve this problem? I want to set dynamic shapes for both batch size and seq_len.
@narendasan when will torch_executed_modules be supported in dynamo mode?
Hi @yjjinjie, you can set the dynamic shapes and pass in the dynamic inputs using torch_tensorrt.Input, where (1, 8, 16) and (1, 2, 3) denote the min/opt/max ranges for batch_size and seq_len respectively. Can you try this and see if you get the same error as above?
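The code sample from this comment was not captured above. A minimal sketch of what such an `Input` spec could look like, assuming a 2-D integer input of shape `(batch_size, seq_len)` and the min/opt/max ranges quoted in the comment (the model name and dtype are assumptions, not from the thread):

```python
import torch
import torch_tensorrt

# Hypothetical dynamic-shape spec: batch_size ranges over 1/8/16
# (min/opt/max) and seq_len over 1/2/3, matching the comment above.
inputs = [
    torch_tensorrt.Input(
        min_shape=(1, 1),
        opt_shape=(8, 2),
        max_shape=(16, 3),
        dtype=torch.int64,
    )
]

# These Input objects are then passed to compilation, e.g.:
# trt_model = torch_tensorrt.compile(model, ir="dynamo", inputs=inputs)
```

TensorRT builds an optimization profile from these three shapes, so any runtime shape between min and max is accepted without recompilation.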
Yes, I have tried torch_tensorrt.Input, but it hits a new bug. The error is:
I also tried the dynamic_shapes approach from https://pytorch.org/TensorRT/user_guide/dynamic_shapes.html; it has the same problem as torch._dynamo.mark_dynamic(a, 0, min=1, max=8196).
@apbose can you help me?
Yeah sure, let me take a look and get back on this.
Bug Description
When I use dynamic shapes in Torch-TensorRT, an error is raised; the static-shape version works fine (just delete the dynamic-shape lines and the error goes away).
To Reproduce
Steps to reproduce the behavior:
The environment: