[Bug] Int64 BroadCast-ArgMax triggers assertion error at graph runtime #11794
@ganler Sorry if I misunderstood, but the situation here seems to be that the broadcast operation is mismatching; if you want it to output a …
The example here is not doing integer casting. If we use Relay directly, we can of course mark the broadcast shape as `int32`. However, external model formats like ONNX often bring such `int64` shapes (not strongly supported, but still valid in the Relay language) into the Relay frontend, which then fails to compile.
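To illustrate the point above: ONNX serializes tensor shapes as `int64`, so imported shape constants arrive in the frontend as `int64` even when the graph only needs 32-bit indices. A minimal NumPy analogue (the shape array's dtype is `int64`, yet plain NumPy tolerates it where the compiled graph did not — the arrays and shapes here are illustrative, not from the issue):

```python
import numpy as np

# ONNX stores dimension sizes as int64, so a Reshape/Expand imported
# into a compiler frontend carries int64 shape constants.
shape = np.array([2, 3], dtype=np.int64)

# NumPy happily consumes the int64 shape; the bug report is about a
# compiled runtime that asserts on the same kind of input.
x = np.arange(6, dtype=np.float32).reshape(tuple(shape))
print(x.shape)  # (2, 3)
```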
@ganler Thanks for answering. I'm investigating a possible fix, but just to be clear on what is expected here: …
@everton1984 Thanks for the interest! There are many integer-mismatch bugs in TVM that happen in the compilation stage; I feel this bug is related, though it happens in the execution phase (I am less familiar with codegen, so I did not locate the bug). Usually the fix is to add some integer promotion to places that assume the IR only carries …
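The "integer promotion" idea in the comment above can be sketched as a tiny helper that always picks the wider index dtype when two meet. This is a hypothetical illustration only; TVM's actual fix operates on its IR types, not on dtype strings:

```python
def promote_index_dtype(a: str, b: str) -> str:
    """Pick the wider of two integer index dtypes.

    Hypothetical helper sketching the promotion the comment suggests:
    wherever code assumed int32-only indices, mixing in an int64
    operand should widen the result to int64.
    """
    widths = {"int32": 32, "int64": 64}
    if a not in widths or b not in widths:
        raise ValueError(f"unsupported index dtype: {a!r} / {b!r}")
    return a if widths[a] >= widths[b] else b

print(promote_index_dtype("int32", "int64"))  # int64
```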
Issue apache#11794. Fix ArgReduce automatic return-type inference by making it use the dtype of the tensor's shape instead of a fixed int32. Includes additional tests.
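The fix described above — index dtype follows the tensor's shape dtype rather than a hard-coded int32 — can be mirrored in NumPy. `argmax_out_dtype` is a hypothetical stand-in for the inference rule, not TVM's API:

```python
import numpy as np

def argmax_out_dtype(shape_dtype: str) -> str:
    # Hypothetical mirror of the fix: ArgReduce's index dtype simply
    # follows the dtype used for the tensor's shape.
    return shape_dtype

x = np.arange(6, dtype=np.float32).reshape(2, 3)

# With an int64 shape, indices come back as int64 instead of int32.
idx = np.argmax(x, axis=1).astype(argmax_out_dtype("int64"))
print(idx.dtype)  # int64
```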
The following minimized code snippet (broadcast-argmax) triggers an error in the graph runtime (i.e., it can compile, but the runtime binary cannot run). At compile time, `te/schedule/bound.cc:119` emits "not in feed graph consumer" warnings. This seems to be a bug in codegen? cc TE's code owners: @junrushao1994 @vinx13 @masahi
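The original minimized Relay snippet is not reproduced in this scrape; the following is an assumed NumPy analogue of the broadcast-argmax pattern it describes. The names and shapes are illustrative, not taken from the issue:

```python
import numpy as np

# Assumed analogue of the minimized pattern: broadcast one operand
# against another, then argmax over the result. In plain NumPy the
# dtype of the shapes never matters, which is the behavior the
# compiled module is expected to match.
a = np.ones((1, 3), dtype=np.float32)   # broadcastable operand
b = np.zeros((2, 3), dtype=np.float32)  # target broadcast shape
out = np.argmax(a + b, axis=1)          # broadcast to (2, 3), reduce axis 1
print(out.shape)  # (2,)
```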
Expected behavior
Accepts inputs at runtime.
Actual behavior
Environment
Ubuntu 20.04. commit tag: 9bba758