Conversation

@desertfire desertfire commented Feb 15, 2024

Summary: Move these util functions from an anonymous namespace to a common header so that later torchgen-ed files can use them.

[ghstack-poisoned]

pytorch-bot bot commented Feb 15, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/119987

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit bb830b9 with merge base 9c597ff:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

desertfire added a commit that referenced this pull request Feb 15, 2024
Summary: Move these util functions from an anonymous namespace to a common header so that later torchgen-ed files can use them.

ghstack-source-id: 70ff068
Pull Request resolved: #119987

@desertfire has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Summary: Move these util functions from an anonymous namespace to a common header so that later torchgen-ed files can use them.

Differential Revision: [D54258088](https://our.internmc.facebook.com/intern/diff/D54258088)

[ghstack-poisoned]

@chenyang78 chenyang78 left a comment

LGTM. Thanks!


@facebook-github-bot

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f only as a last resort; consider -i/--ignore-current instead, which continues the merge while ignoring current failures and lets pending tests finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

pytorchmergebot pushed a commit that referenced this pull request Feb 29, 2024
Summary: Switch codegen for sdpa to always point to v2 in the C shim. Since aoti_torch__scaled_dot_product_flash_attention_v2 has been introduced for a while, there shouldn't be any FC issue in production.

Differential Revision: [D54258090](https://our.internmc.facebook.com/intern/diff/D54258090)
Pull Request resolved: #120592
Approved by: https://github.com/chenyang78
ghstack dependencies: #119987
pytorchmergebot pushed a commit that referenced this pull request Feb 29, 2024
Summary: Currently the logics for filling the default value for optional arguments are scattered in several places. By storing OpOverload in the base ExternKernel class, we can simplify codegen_kwargs, and this is a preparation step for enabling the torchgen-ed C shim. The default value filling logic for FallbackKernel can also be simplified, but that can come later.

Differential Revision: [D54258089](https://our.internmc.facebook.com/intern/diff/D54258089)
Pull Request resolved: #120629
Approved by: https://github.com/chenyang78
ghstack dependencies: #119987, #120592
@github-actions github-actions bot deleted the gh/desertfire/335/head branch March 31, 2024 01:53
5 participants