[quant] Prep for conv_transpose packing #39714
Conversation
💊 Dr. CI failures summary: as of commit 582205a, 1 extra GitHub check failed (ci.pytorch.org).
vkuzo left a comment:
Looks reasonable; ideally we could review this for real once the serialization story is agreed upon.
      kSpatialDim,
      "D convolution.");
  const int output_channels = weight.size(0);
  const int output_channels_idx = transpose ? 1 : 0;
Optional: if you want these to be more readable, you could define an enum `ConvType`, with values `ConvType::CONV` and `ConvType::CONV_TRANSPOSE`, and pass that around instead.
I love this solution -- I was originally thinking of casting the boolean to int (as in `int(transpose)`), but thought that would be confusing when used as a channel index.
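For reference, a minimal sketch of what the suggested enum could look like; the names `ConvType` and `output_channels_dim` are illustrative and not part of the actual patch:

```cpp
#include <cstdint>

// Hypothetical sketch of the suggested ConvType enum (not from the patch).
enum class ConvType : uint8_t {
  CONV = 0,
  CONV_TRANSPOSE = 1,
};

// For a regular convolution the weight layout is
//   [out_channels, in_channels / groups, kH, kW],
// while for a transposed convolution it is
//   [in_channels, out_channels / groups, kH, kW],
// so the index of the output-channel dimension depends on the conv type.
inline int output_channels_dim(ConvType type) {
  return type == ConvType::CONV_TRANSPOSE ? 1 : 0;
}
```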
      {(uint32_t)dilation_[1], (uint32_t)dilation_[0]},
      {(uint32_t)padding_[0], (uint32_t)padding_[1],
       (uint32_t)padding_[0], (uint32_t)padding_[1]},
      /*adjustment=*/{0, 0},
What does "adjustment" mean here? Is it OK to replace it with output padding?
Yeah, the terminology sucks -- technically this is an adjustment to the output size, which is the same as the output_padding. The only difference is that "adjustment" refers to the output size in general, while "padding" refers to the padding applied to each side separately.
Basically, I am using "adjustment" here because QNNPACK uses that terminology, while PyTorch uses "padding" -- should I change it?
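To make the terminology discussion concrete, here is a sketch (not part of the patch; the helper name and signature are illustrative) of where the adjustment / output_padding term enters the standard transposed-convolution output-size formula:

```cpp
#include <cstdint>

// Output size of a transposed convolution along one spatial dimension,
// following the usual ConvTranspose formula. The final additive term is
// what PyTorch calls output_padding and QNNPACK calls "adjustment".
inline int64_t conv_transpose_output_size(
    int64_t input_size,
    int64_t kernel,
    int64_t stride,
    int64_t padding,
    int64_t dilation,
    int64_t output_padding /* QNNPACK's "adjustment" */) {
  return (input_size - 1) * stride - 2 * padding +
      dilation * (kernel - 1) + output_padding + 1;
}
```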
vkuzo left a comment:
Looks good if CI passes.
Codecov Report

|          | gh/z-a-f/35/base | #39714 | +/- |
|----------|------------------|--------|-----|
| Coverage | ?                | 69.34% |     |
| Files    | ?                | 381    |     |
| Lines    | ?                | 47170  |     |
| Branches | ?                | 0      |     |
| Hits     | ?                | 32712  |     |
| Misses   | ?                | 14458  |     |
| Partials | ?                | 0      |     |

Continue to review the full report at Codecov.
Stack from ghstack:
Differential Revision: D22087071