[Bug] Dynamic Varsize error when tune detection model with autotvm #10042
Labels: needs-triage, type: bug
Hi all,
I am a beginner of TVM and I found it's suppressingly fast when deploy on mobile devices compared with other frameworks like ncnn.
However, when I tried to tune a detection model (YOLOX) with autotvm on PC's CPU, I got a type error from converting Virtual axis. The demo I followed is from https://tvm.apache.org/docs/reference/api/python/autotvm.html and the error comes from the code:
tasks = autotvm.task.extract_from_program(mod["main"], target=target, params=params)
I guess the error comes from the resize operator used in the FPN of detection models, but even after I fixed all the input shapes, there are still dynamic axis sizes (any_dim: int32).
I wonder if there is a solution or workaround. Thank you.
I am using TVM version '0.9.dev0'.
The error log is as below:
Exception in thread Thread-1: