Enable type-checking of torch.nn.quantized.* modules #43110
Conversation
@guilhermeleobas did you somehow open this PR by mistake? It's the same as gh-43068.
@rgommers I couldn't revert the submodule update in the other PR, so I will close the old one in favor of this one.
rgommers
left a comment
Thanks @guilhermeleobas. One thing to fix: splitting annotations between a .py and a .pyi file isn't a good idea.

Re the last commit: in this case it's a small change, so no action is needed here, but in general it's better not to mix inlining annotations with changes that fix errors. I edited https://github.com/pytorch/pytorch/wiki/Guide-for-adding-type-annotations-to-PyTorch#setting-up-and-checking-mypy-works to make this clearer.
Okay, thanks for the heads-up. In the future, I will just create another PR if this case comes up again. :)
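The advice above, keeping annotations inline in the .py file rather than splitting them into a .pyi stub, can be sketched as follows. The module and function names here are hypothetical illustrations, not PyTorch's actual code.

```python
# Hypothetical module sketch: annotations live inline in the .py file,
# so mypy checks the function bodies against their own signatures and
# no separate .pyi stub can drift out of sync with the implementation.

def quantize_value(value: float, scale: float, zero_point: int) -> int:
    """Map a float to its nearest quantized integer representation."""
    return int(round(value / scale)) + zero_point

def dequantize_value(q: int, scale: float, zero_point: int) -> float:
    """Invert quantize_value, up to rounding error."""
    return (q - zero_point) * scale
```

With inline annotations, running mypy on the module checks both callers and the function bodies; with a separate .pyi stub, the body itself is not checked against the declared types.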
Codecov Report
@@            Coverage Diff            @@
##           master   #43110    +/-   ##
========================================
  Coverage   69.24%   69.25%
========================================
  Files         381      381
  Lines       47577    47582        +5
========================================
+ Hits        32947    32951        +4
- Misses      14630    14631        +1

Continue to review full report at Codecov.
rgommers
left a comment
LGTM
facebook-github-bot
left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fixes #43029
I am not changing the following files in this PR:
- torch/nn/quantized/dynamic/modules/rnn.py, due to "LSTM::permute_hidden breaks Liskov substitution principle" #43072
- torch/nn/quantized/modules/conv.py
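As a hedged illustration of the Liskov substitution issue that blocks rnn.py (the classes below are hypothetical stand-ins, not PyTorch's real LSTM hierarchy): when an override narrows a parameter type, instances of the subclass can no longer be used everywhere the base class is expected, which mypy reports as an incompatible override.

```python
from typing import Optional, Tuple, List

class BaseRNN:
    # The base class accepts hx=None and passes it through unchanged.
    def permute_hidden(
        self, hx: Optional[Tuple[int, ...]], permutation: Optional[List[int]]
    ) -> Optional[Tuple[int, ...]]:
        if hx is None or permutation is None:
            return hx
        return tuple(hx[i] for i in permutation)

class NarrowedLSTM(BaseRNN):
    # mypy flags this override: "hx" no longer accepts None, so a
    # NarrowedLSTM cannot substitute for a BaseRNN (Liskov violation).
    def permute_hidden(  # type: ignore[override]
        self, hx: Tuple[int, ...], permutation: Optional[List[int]]
    ) -> Tuple[int, ...]:
        if permutation is None:
            return hx
        return tuple(hx[i] for i in permutation)
```

The code runs fine at runtime; the problem is purely one of static typing, which is why the file was left out of this type-checking PR rather than fixed inline.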