[quant] quantized path for ConstantPadNd #43304
Conversation
💊 CI failures summary (Dr. CI, as of commit 4d5c29b): ci.pytorch.org reported 1 failed job.
```cpp
const auto memory_format = self.suggest_memory_format();
output = at::_empty_affine_quantized(
    new_shape, self.options().memory_format(memory_format),
    self.q_scale(), self.q_zero_point(), c10::nullopt);
```
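For context, the quantized path this PR adds can be exercised from Python through `torch.nn.functional.pad` with `mode='constant'`, which dispatches to `constant_pad_nd`. A minimal sketch (the constant fill value is given in float and is quantized with the input's scale and zero point):

```python
import torch
import torch.nn.functional as F

# Quantize a float tensor, then pad it via the quantized constant-pad path.
x = torch.randn(1, 2, 3, 3)
xq = torch.quantize_per_tensor(x, scale=0.1, zero_point=128, dtype=torch.quint8)

# mode='constant' routes to constant_pad_nd; the output reuses the
# input's quantization parameters (q_scale / q_zero_point).
yq = F.pad(xq, (1, 1, 1, 1), mode='constant', value=0.0)
```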
Is the last argument the memory format? Why do we have that?
This is required because, if it is not explicitly passed as nullopt, there would be two conflicting memory_format arguments: one coming from the options and one passed as the explicit argument.
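As a rough illustration of the memory-format handling from the Python side: the output buffer is allocated with the input's suggested memory format, so a channels-last quantized input is expected to keep that layout through the pad. A sketch only; the exact layout behavior may depend on the PyTorch build:

```python
import torch
import torch.nn.functional as F

# A channels-last float tensor, quantized per-tensor.
x = torch.randn(1, 4, 8, 8).contiguous(memory_format=torch.channels_last)
xq = torch.quantize_per_tensor(x, scale=0.05, zero_point=0, dtype=torch.quint8)

# Default mode for F.pad is 'constant', i.e. constant_pad_nd.
yq = F.pad(xq, (2, 2, 2, 2))

# The output is allocated using suggest_memory_format(), so a
# channels-last input should yield a channels-last output.
print(yq.is_contiguous(memory_format=torch.channels_last))
```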
Differential Revision: [D23231946](https://our.internmc.facebook.com/intern/diff/D23231946)
Codecov Report

```
@@            Coverage Diff                @@
##    gh/z-a-f/54/base    #43304    +/-   ##
=============================================
   Coverage      69.24%    69.24%
=============================================
   Files            381       381
   Lines          47573     47573
=============================================
   Hits           32943     32943
   Misses         14630     14630
```
A little help needed: can you tell me how to write my own quantized layer? I simply want to re-create quantized reflection padding.
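One common workaround, sketched below under the assumption that reflection padding has no quantized kernel: dequantize, pad in float, and re-quantize with the input's quantization parameters. The module name is hypothetical, not an official PyTorch API:

```python
import torch

class QuantReflectionPad2d(torch.nn.Module):
    """Hypothetical quantized reflection pad: dequantize -> pad -> requantize."""

    def __init__(self, padding):
        super().__init__()
        self.pad = torch.nn.ReflectionPad2d(padding)

    def forward(self, xq):
        y = self.pad(xq.dequantize())       # reflection pad in float
        return torch.quantize_per_tensor(   # reuse the input's qparams
            y, xq.q_scale(), xq.q_zero_point(), xq.dtype)

xq = torch.quantize_per_tensor(
    torch.randn(1, 2, 4, 4), scale=0.1, zero_point=128, dtype=torch.quint8)
yq = QuantReflectionPad2d(1)(xq)
```

Note that re-quantizing introduces one extra round-trip of quantization error, which is usually acceptable for padding.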
Stack from ghstack:
Differential Revision: D23231946