[MPS] Add group_norm[fwd+backward] and mean_var (take 2)
#91190
Conversation
Use Prims to implement `group_norm`, `group_norm_backward` and `mean_var`. Use `torch._ops.ops` instead of `torch.ops` in numerous subpackages in order to be able to register a Python implementation in the MPS backend. TODO: Fix the MPS `var` implementation and add it to prims. Fixes #88331 [ghstack-poisoned]
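For context, `group_norm` decomposes into exactly the `mean_var` + affine pattern this PR builds from Prims. A rough numpy sketch of that decomposition (illustrative only, not the PR's actual Prims code; `group_norm_fwd` is a hypothetical helper name):

```python
import numpy as np

def group_norm_fwd(x, num_groups, weight, bias, eps=1e-5):
    # x: (N, C, *spatial). Fold channels into groups and normalize each
    # group by its own mean/variance -- group_norm reduces to a
    # mean/var computation followed by a per-channel affine transform.
    n, c = x.shape[:2]
    g = x.reshape(n, num_groups, -1)
    mean = g.mean(axis=-1, keepdims=True)
    var = g.var(axis=-1, keepdims=True)  # biased variance, as in norm layers
    y = ((g - mean) / np.sqrt(var + eps)).reshape(x.shape)
    # Per-channel affine applied after normalization.
    shape = (1, c) + (1,) * (x.ndim - 2)
    return y * weight.reshape(shape) + bias.reshape(shape)

x = np.random.randn(2, 6, 4, 4)
out = group_norm_fwd(x, num_groups=3, weight=np.ones(6), bias=np.zeros(6))
```

With unit weight and zero bias, each (sample, group) slice of the output has mean ~0 and variance ~1, which is the invariant the Prims decomposition preserves.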
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/91190
Note: Links to docs will display an error until the docs builds have been completed. ✅ No Failures as of commit f7ce741. This comment was automatically generated by Dr. CI and updates every 15 minutes.
I wonder if it'll be better done using
albanD left a comment
Can you explain in more detail what the difference is between `torch.ops` and `torch._ops.ops`?
As far as I can tell they are exactly the same thing.
The new registration looks good.
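As the later revision of this PR notes, `torch.ops` is just an alias for `torch._ops.ops` assigned near the end of `torch/__init__.py`, so the two are the same object once import finishes; the difference only matters for submodules imported before the alias is assigned. A self-contained sketch of that import-ordering issue (the package `pkg` here is hypothetical, standing in for the `torch` layout):

```python
import sys
import types

# Build a tiny package in memory mirroring the torch layout:
# pkg._ops defines `ops`, and pkg/__init__ only assigns the alias
# `pkg.ops = pkg._ops.ops` at the very end of its own init.
pkg = types.ModuleType("pkg")
_ops = types.ModuleType("pkg._ops")
_ops.ops = object()               # stands in for the real op registry
sys.modules["pkg"] = pkg
sys.modules["pkg._ops"] = _ops
pkg._ops = _ops

# Code running early (e.g. a subpackage imported during pkg's init)
# cannot see pkg.ops yet, but pkg._ops.ops always works.
has_alias_early = hasattr(pkg, "ops")   # False: alias not assigned yet
via_private = pkg._ops.ops              # available at any point

pkg.ops = _ops.ops                      # alias assigned last during init
assert pkg.ops is pkg._ops.ops          # identical objects afterwards
```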
@pytorchbot merge -f "MPS tests are passing"
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
@pytorchbot revert -m "Broke test_correct_module_names because of underscore _ops" -c weird
@pytorchbot successfully started a revert job. Check the current status here.
@malfet your PR has been successfully reverted.
This reverts commit 371716e. Reverted #91190 on behalf of https://github.com/kit1980 due to Broke test_correct_module_names because of underscore _ops
Use Prims to implement `group_norm`, `group_norm_backward` and `mean_var`. Use `torch._ops.ops` instead of `torch.ops` in numerous subpackages in order to be able to make them importable from `torch/backend/mps/__init__.py`, as this alias, defined in https://github.com/pytorch/pytorch/blob/15af4b1ceea3b689354ad664675b930486804c8e/torch/__init__.py#L1095, is executed last during the init process. Depends on #91203. Fixes #88331 [ghstack-poisoned]
Use Prims to implement `group_norm`, `group_norm_backward` and `mean_var`. Use `torch._ops.ops` instead of `torch.ops` in numerous subpackages in order to be able to make them importable from `torch/backend/mps/__init__.py`, as this alias, defined in https://github.com/pytorch/pytorch/blob/15af4b1ceea3b689354ad664675b930486804c8e/torch/__init__.py#L1095, is executed last during the init process. Add `__all__` to `torch/backends/mps/__init__.py`, as well as alias all imports as private. Add `TestNNMPS.test_group_norm_backward`, which validates that no NaNs are generated during the backward pass. Fixes #88331 [ghstack-poisoned]
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 Hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Stack from ghstack (oldest at bottom):
- [MPS] Add group_norm[fwd+backward] and mean_var (take 2) #91190

Use Prims to implement `group_norm`, `group_norm_backward` and `mean_var`.
Use `torch._ops.ops` instead of `torch.ops` in numerous subpackages in order to be able to make them importable from `torch/backend/mps/__init__.py`, as this alias (defined in `pytorch/torch/__init__.py`, line 1095 in 15af4b1) is executed last during the init process.
Add `__all__` to `torch/backends/mps/__init__.py`, as well as alias all imports as private.
Add `TestNNMPS.test_group_norm_backward`, which validates that no NaNs are generated during the backward pass.
Fixes #88331
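The backward pass the new test exercises reduces to the standard normalization gradient. A numpy sketch of the input gradient for unit weight (an illustration of the math, not the PR's MPS kernel; `group_norm_bwd_input` is a hypothetical helper), together with the kind of no-NaN check `test_group_norm_backward` performs:

```python
import numpy as np

def group_norm_bwd_input(x, dy, num_groups, eps=1e-5):
    # Gradient of group_norm w.r.t. the input, for unit weight / zero bias:
    #   dx = (dy - mean(dy) - xhat * mean(dy * xhat)) / sqrt(var + eps)
    # with all statistics taken per (sample, group). NaNs appear here only
    # if var + eps underflows or the stats themselves are NaN.
    n = x.shape[0]
    g = x.reshape(n, num_groups, -1)
    gy = dy.reshape(n, num_groups, -1)
    mean = g.mean(axis=-1, keepdims=True)
    var = g.var(axis=-1, keepdims=True)
    inv_std = 1.0 / np.sqrt(var + eps)
    xhat = (g - mean) * inv_std
    dx = (gy - gy.mean(axis=-1, keepdims=True)
          - xhat * (gy * xhat).mean(axis=-1, keepdims=True)) * inv_std
    return dx.reshape(x.shape)

x = np.random.randn(2, 6, 4, 4)
dy = np.random.randn(*x.shape)
dx = group_norm_bwd_input(x, dy, num_groups=3)
assert not np.isnan(dx).any()   # the failure mode #88331 guards against
```

A useful sanity property: because normalization is invariant to a per-group constant shift, the gradient sums to zero within each (sample, group) slice.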