
[BUG FIX] Remove unnecessary padding during NMS #2584

Conversation

LuukvandenBent (Contributor)
Motivation

When deploying RTMDet-Ins models, the output breaks when there are fewer predictions than the `keep_top_k` setting. The cause is padding that is applied to the bboxes but not to the masks, so when the top-k detections are selected there are more bboxes than masks. This PR (together with the previous one) fixes that.

Related to #2571 and #2574.
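
To illustrate the failure mode, here is a minimal, self-contained sketch. The tensor shapes and the `keep_top_k` value are made up for illustration; this is not the actual mmdeploy code path:

```python
import torch

num_dets = 3    # detections that actually survive NMS
keep_top_k = 5  # configured keep_top_k

dets = torch.rand(1, num_dets, 5)        # (batch, N, x1y1x2y2 + score)
masks = torch.rand(1, num_dets, 28, 28)  # one mask per detection

# the bboxes get a dummy padding row, but the masks do not
dets = torch.cat((dets, dets.new_zeros((1, 1, 5))), 1)  # now (1, 4, 5)

# selecting the top-k over the padded dets can now return an index
# that points at the padding row, which has no corresponding mask
topk = min(keep_top_k, dets.shape[1])
_, inds = dets[..., 4].topk(topk, dim=1)
print(dets.shape[1], masks.shape[1])  # 4 vs. 3 -> bbox/mask mismatch
# masks[0, inds[0]]  # IndexError once inds includes the padded row
```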

Modification

Remove the unnecessary padding of the bboxes, which is not applied to the masks.

```diff
@@ -349,10 +349,6 @@ def _multiclass_nms_single(boxes: Tensor,
     dets = torch.cat([boxes, scores], dim=2)
     labels = cls_inds.unsqueeze(0)

-    # pad
-    dets = torch.cat((dets, dets.new_zeros((1, 1, 5))), 1)
```
Collaborator:

This padding is useful for the case when `selected_indices` is a zero-shaped tensor.
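
A small sketch of the case the reviewer is pointing at. The `topk` call here stands in for whatever downstream op consumes the detections; it is an assumption, not the actual code:

```python
import torch

# NMS kept nothing: zero detections survive
dets = torch.zeros(1, 0, 5)

# without padding, any fixed-k selection on the empty axis fails:
# dets[..., 4].topk(1, dim=1)  # RuntimeError: k out of range

# the padding row guarantees there is at least one entry to select
padded = torch.cat((dets, dets.new_zeros((1, 1, 5))), 1)
scores, inds = padded[..., 4].topk(1, dim=1)  # works, returns the dummy row
```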
