
Conversation

@pytorch-bot

pytorch-bot bot commented Nov 9, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/88745

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit d1f664b:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@anjali411 anjali411 added the release notes: composability release notes category label Nov 9, 2022
@ezyang
Contributor

ezyang commented Nov 9, 2022

Test?

anjali411 added a commit that referenced this pull request Nov 9, 2022
ghstack-source-id: 2b6faf3
Pull Request resolved: #88745
TODO: add an OpInfo

[ghstack-poisoned]
anjali411 added a commit that referenced this pull request Nov 10, 2022
ghstack-source-id: f75a00e
Pull Request resolved: #88745
TODO: add an OpInfo

[ghstack-poisoned]
@albanD albanD removed their request for review November 14, 2022 15:22
@anjali411 anjali411 requested a review from zou3519 as a code owner November 15, 2022 11:06
@anjali411
Contributor Author

@pytorchbot merge -g

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Nov 15, 2022
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks on your PR pass since you used the green (-g) flag (ETA: 0-4 Hours).


@pytorchmergebot
Collaborator

Merge failed

Reason: The following mandatory check(s) failed (Rule superuser):

Dig deeper by viewing the failures on hud



@register_meta([aten.round.default, aten.round.decimals])
def meta_round(self, **kwargs):
    return _elementwise_meta(self, type_promotion=ELEMENTWISE_PRIM_TYPE_PROMOTION_KIND.DEFAULT)
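
For context, a meta registration like this is what lets the operator run on "meta" tensors, returning only shape/dtype/stride information with no data. A minimal usage sketch (assuming current torch behavior; not part of this PR's diff):

import torch

# round on a meta tensor produces metadata only: same shape/dtype, no storage.
x = torch.empty(3, 4, device="meta")
y = torch.round(x)
print(y.shape, y.dtype, y.device)        # torch.Size([3, 4]) torch.float32 meta
print(torch.round(x, decimals=2).shape)  # the decimals overload is covered too
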
anjali411 (Contributor Author)

@mruberry @ezyang is it OK to reuse this function, which is used to compute the correct meta information (strides, dtype)? It seems like the meta and prim implementations for all elementwise functions should be the same.

Collaborator

That is the intention, yes, but that meta function is intended for prims.round:

round = _make_elementwise_unary_prim(

I'm assuming this is adding a meta function for refs.round, torch.round, or aten.round? In that case we need to look at what refs.round does that prims.round does not:

# TODO: round takes additional kwargs

But luckily in this case it just performs "default" type promotion, which is a computation detail and doesn't affect the output's metadata.

Anyway, in this case I think you can directly use the meta function for elementwise prims. Note that if you wanted to write a meta function for an operation like refs.add then the meta function for prims.add would not be correct. A helper for writing meta functions for elementwise torch/aten/reference operations could probably be written without too much trouble, though.
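
For illustration, a minimal sketch of what such a helper for torch/aten/reference-level elementwise ops might look like, built only on public APIs (the name elementwise_meta is hypothetical, and stride propagation is omitted for brevity):

import functools
import torch

def elementwise_meta(*tensors, dtype=None):
    # Hypothetical helper: output shape comes from broadcasting, output dtype
    # from standard type promotion; the result is a data-less "meta" tensor.
    shape = torch.broadcast_shapes(*(t.shape for t in tensors))
    if dtype is None:
        dtype = functools.reduce(torch.promote_types, (t.dtype for t in tensors))
    return torch.empty(shape, dtype=dtype, device="meta")

# An add-like op: (2, 3) int64 with (3,) float32 -> (2, 3) float32
a = torch.empty(2, 3, dtype=torch.int64, device="meta")
b = torch.empty(3, dtype=torch.float32, device="meta")
out = elementwise_meta(a, b)
print(out.shape, out.dtype)  # torch.Size([2, 3]) torch.float32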

anjali411 (Contributor Author)

Note that if you wanted to write a meta function for an operation like refs.add then the meta function for prims.add would not be correct.

how come?

Collaborator

The prims don't support features like type promotion, and torch.add/refs.add in particular has unique behavior for handling its alpha parameter.
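
A quick example of the difference (standard torch behavior, shown here only to illustrate the point):

import torch

# torch.add / refs.add promote dtypes and fold alpha into the second operand,
# so their output metadata can differ from prims.add, which requires matching dtypes.
a = torch.ones(2, dtype=torch.int64)
b = torch.ones(2, dtype=torch.float32)
out = torch.add(a, b, alpha=2)   # computes a + alpha * b
print(out.dtype)                 # torch.float32 (promoted), not int64
print(out)                       # tensor([3., 3.])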

@anjali411
Contributor Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks on your PR pass since you used the green (-g) flag (ETA: 0-4 Hours).


TODO: add an OpInfo

cc mlazos soumith voznesenskym yanboliang penguinwu anijain2305 EikanWang jgong5 Guobing-Chen chunyuan-w XiaobingSuper zhuhaozhe blzheng Xia-Weiwen wenzhe-nrv jiayisunx peterbell10 desertfire

[ghstack-poisoned]
@pytorchmergebot
Collaborator

Merge failed

Reason: New commits were pushed while merging. Please rerun the merge command.


@anjali411
Contributor Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks on your PR pass since you used the green (-g) flag (ETA: 0-4 Hours).


kulinseth pushed a commit to kulinseth/pytorch that referenced this pull request Dec 10, 2022
@facebook-github-bot facebook-github-bot deleted the gh/anjali411/220/head branch June 8, 2023 15:18

Labels

ciflow/trunk, Merged, module: inductor, release notes: composability


5 participants