Add meta impl for grid_sampler_2d_backward #88745
Conversation
🔗 Helpful Links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/88745. Note: links to docs will display an error until the docs builds have been completed. ✅ No failures as of commit d1f664b. This comment was automatically generated by Dr. CI and updates every 15 minutes.
Test?
@pytorchbot merge -g
Merge started: your change will be merged once all checks on your PR pass, since you used the green (-g) flag (ETA: 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: the following mandatory check(s) failed (Rule …). Dig deeper by viewing the failures on hud.
torch/_meta_registrations.py (outdated):

```python
@register_meta([aten.round.default, aten.round.decimals])
def meta_round(self, **kwargs):
    return _elementwise_meta(self, type_promotion=ELEMENTWISE_PRIM_TYPE_PROMOTION_KIND.DEFAULT)
```
That is the intention, yes, but that meta function is intended for prims.round:

```python
# pytorch/torch/_prims/__init__.py, line 810 at 5faa279
round = _make_elementwise_unary_prim(
```

I'm assuming this is adding a meta function for refs.round, torch.round, or aten.round? In that case we need to look at what refs.round does that prims.round does not:

```python
# pytorch/torch/_refs/__init__.py, line 778 at 5faa279
# TODO: round takes additional kwargs
```

But luckily in this case it just performs "default" type promotion, which is a computation detail and doesn't affect the output's metadata.
Anyway, in this case I think you can directly use the meta function for elementwise prims. Note that if you wanted to write a meta function for an operation like refs.add then the meta function for prims.add would not be correct. A helper for writing meta functions for elementwise torch/aten/reference operations could probably be written without too much trouble, though.
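As a rough illustration of that last point, such a helper might look like the following — a hypothetical sketch, not PyTorch's actual helper. The name `elementwise_meta_with_promotion` is made up; `torch.broadcast_shapes` and `torch._prims_common.elementwise_dtypes` are real APIs.

```python
# Hypothetical helper (not PyTorch's actual code): a meta function for
# elementwise torch/aten/refs operations. Unlike the prim-level
# _elementwise_meta, it broadcasts shapes and promotes dtypes before
# producing the output's metadata.
import torch
from torch._prims_common import (
    ELEMENTWISE_TYPE_PROMOTION_KIND,
    elementwise_dtypes,
)

def elementwise_meta_with_promotion(*tensors, type_promotion_kind):
    # Elementwise torch ops broadcast their inputs to a common shape...
    out_shape = torch.broadcast_shapes(*(t.shape for t in tensors))
    # ...and promote mixed input dtypes (e.g. int64 + float32 -> float32).
    _, result_dtype = elementwise_dtypes(
        *tensors, type_promotion_kind=type_promotion_kind
    )
    return torch.empty(out_shape, dtype=result_dtype, device="meta")

# Example: metadata for an op like torch.mul on int64 x float32 inputs.
a = torch.empty(2, 1, dtype=torch.int64, device="meta")
b = torch.empty(1, 3, dtype=torch.float32, device="meta")
out = elementwise_meta_with_promotion(
    a, b, type_promotion_kind=ELEMENTWISE_TYPE_PROMOTION_KIND.DEFAULT
)
print(out.shape, out.dtype)  # torch.Size([2, 3]) torch.float32
```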
"Note that if you wanted to write a meta function for an operation like refs.add then the meta function for prims.add would not be correct."

how come?
The prims don't support features like type promotion, and torch.add/refs.add in particular has unique behavior for handling its alpha parameter.
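A small demonstration of both points on meta tensors (a sketch; the exact error text may differ across versions):

```python
import torch

a = torch.empty(3, dtype=torch.int64, device="meta")
b = torch.empty(3, dtype=torch.float32, device="meta")

# torch.add/refs.add promote mixed input dtypes, so the output is float32.
# A meta function written at the prim level would miss this, because prims
# require their inputs to already share a dtype.
print(torch.add(a, b).dtype)  # torch.float32

# alpha has its own checks: a floating-point alpha is rejected for integral
# inputs, which a correct meta function for torch.add must also model.
c = torch.empty(3, dtype=torch.int64, device="meta")
try:
    torch.add(a, c, alpha=0.5)
except RuntimeError as e:
    print(e)  # e.g. "For integral input tensors, argument alpha must not be ..."
```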
@pytorchbot merge -g
Merge started: your change will be merged once all checks on your PR pass, since you used the green (-g) flag (ETA: 0-4 hours).
Merge failed. Reason: new commits were pushed while merging. Please rerun the merge command.
@pytorchbot merge -g
Merge started: your change will be merged once all checks on your PR pass, since you used the green (-g) flag (ETA: 0-4 hours).
TODO: add an OpInfo
Pull Request resolved: pytorch#88745
Approved by: https://github.com/ezyang
Stack from ghstack (oldest at bottom):
TODO: add an OpInfo
cc @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire
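For context on what a "meta impl" is here: a meta function runs on meta tensors and computes only the output's shape, dtype, and memory format, without doing any real work. The change boils down to a registration along these lines — a simplified sketch, not necessarily the exact code that was merged, and re-registering it against a current PyTorch build would conflict with the entry that already exists:

```python
# Sketch of a meta implementation for aten.grid_sampler_2d_backward: only
# output metadata is produced. grad_input matches input's shape/dtype,
# grad_grid matches grid's, and output_mask[0] says whether a gradient for
# input is needed at all.
import torch
from torch._meta_registrations import register_meta  # helper's location, as an assumption

@register_meta(torch.ops.aten.grid_sampler_2d_backward.default)
def meta_grid_sampler_2d_backward(
    grad_output,
    input,
    grid,
    interpolation_mode,
    padding_mode,
    align_corners,
    output_mask,
):
    grad_input = (
        torch.zeros_like(input, memory_format=torch.contiguous_format)
        if output_mask[0]
        else None
    )
    grad_grid = torch.empty_like(grid, memory_format=torch.contiguous_format)
    return grad_input, grad_grid
```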