
feat: Use torch.inference_mode() for TableQA #3731

Merged 3 commits into main on Dec 19, 2022

Conversation

sjrl (Contributor) commented Dec 19, 2022

Related Issues

Proposed Changes:

  • Swapped all torch.no_grad() calls for torch.inference_mode()
  • Wrapped the forward pass of RCIReader in a with torch.inference_mode() block
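The change described above can be sketched as follows. The model and inputs here are hypothetical stand-ins, not Haystack's actual TableQA code; the point is the swap of the context manager. Like torch.no_grad(), torch.inference_mode() disables gradient tracking, but it also skips autograd's version-counter and view bookkeeping, making it slightly faster; the trade-off is that tensors created inside it cannot later participate in autograd.

```python
import torch

# Stand-ins for the TableQA model and its encoded inputs (hypothetical).
model = torch.nn.Linear(4, 2)
inputs = torch.randn(3, 4)

# Before the change, the forward pass ran under torch.no_grad():
#     with torch.no_grad():
#         logits = model(inputs)
# After the change, it runs under torch.inference_mode():
with torch.inference_mode():
    logits = model(inputs)

print(logits.requires_grad)   # False: no gradients are tracked
print(logits.is_inference())  # True: tensor was created in inference mode
```

Since the output tensors are inference tensors, this swap is only safe in pure-inference paths (as in this PR), where the results are never fed back into a computation that requires gradients.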

How did you test it?

Existing tests

Notes for the reviewer

Checklist

sjrl requested a review from a team as a code owner December 19, 2022 10:54
sjrl requested review from mayankjobanputra and removed the request for a team December 19, 2022 10:54
julian-risch (Member) left a comment

Looks very good to me! 👍 Thanks for picking up the topic so quickly after my comment. Great to see how we benefit from upgrading the transformers version in different parts of the code.

julian-risch removed the request for review from mayankjobanputra December 19, 2022 11:36
sjrl merged commit d7fabb5 into main Dec 19, 2022
sjrl deleted the tableqa-inference-mode branch December 19, 2022 12:07