[Autograd] Add Default Autograd Fallback for PrivateUse1 in PyTorch #165315
fffrog wants to merge 2 commits into gh/fffrog/179/base
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/165315
Note: Links to docs will display an error until the docs builds have been completed. ⏳ No Failures, 22 Pending. As of commit 116fd45 with merge base e67e3d9. UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Please refer to this [link](#163979) for more background.
- Allow registering a fallback for AutogradPrivateUse1 multiple times.
- Add an Autograd fallback implementation for AutogradPrivateUse1.

PyTorch can provide a common implementation for AutogradPrivateUse1, and the user can override it based on the needs of a specific accelerator.

ghstack-source-id: 279846c
Pull-Request: #165315
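For context, a minimal sketch of the kind of registration involved; this is not code from this PR's diff. It assumes the stock `torch::autograd::autogradNotImplementedFallback()` boxed kernel, which out-of-tree PrivateUse1 backends have typically registered themselves until now:

```cpp
#include <torch/library.h>
#include <torch/csrc/autograd/autograd_not_implemented_fallback.h>

// Register the stock "not implemented" autograd kernel as a boxed fallback
// for every op dispatched through the AutogradPrivateUse1 key. With this
// stack, an equivalent default ships in PyTorch itself and a backend may
// register its own fallback on top of it.
TORCH_LIBRARY_IMPL(_, AutogradPrivateUse1, m) {
  m.fallback(torch::autograd::autogradNotImplementedFallback());
}
```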
@pytorchbot label "topic: not user facing"

Deferring to Alban here; there was some discussion about this.

Got it, thank you.
albanD
left a comment
Sounds good as a first step.
Let's continue to chat on the issue for the inplace or view kernel.
Got it, thank you.

Starting merge as part of PR stack under #165316

Rebased

@pytorchbot merge

Merge started. Your change will be merged once all checks pass (ETA 0-4 Hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.

Starting merge as part of PR stack under #165316
…Private1 (#165316)
As the title states: the fallback for AutogradPrivateUse1 is built into PyTorch, so there is no need to register a general implementation for out-of-tree backends.
Pull Request resolved: #165316
Approved by: https://github.com/ezyang, https://github.com/albanD
ghstack dependencies: #165315
Please refer to this [link](pytorch/pytorch#163979) for more background.
- Allow registering a fallback for AutogradPrivateUse1 multiple times.
- Add an Autograd fallback implementation for AutogradPrivateUse1.

PyTorch can provide a common implementation for AutogradPrivateUse1, and the user can override it based on the needs of a specific accelerator.

ghstack-source-id: 8bc59b0
Pull-Request: pytorch/pytorch#165315
Stack from ghstack (oldest at bottom):
Please refer to this link for more background.
PyTorch can provide a common implementation for AutogradPrivateUse1, and the user can override it based on the needs of a specific accelerator.
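To illustrate the override path mentioned above, here is a hedged sketch of what an accelerator-specific replacement could look like once re-registration is permitted. The function name and its behavior are hypothetical and not part of this PR:

```cpp
#include <ATen/core/LegacyTypeDispatch.h>
#include <ATen/core/dispatch/Dispatcher.h>
#include <torch/library.h>

// Hypothetical backend-specific fallback (illustrative only): warn once,
// then skip autograd handling and redispatch straight to the PrivateUse1
// backend kernel. Ops handled this way will not record an autograd graph.
static void my_accelerator_autograd_fallback(
    const c10::OperatorHandle& op,
    torch::jit::Stack* stack) {
  TORCH_WARN_ONCE(
      "Custom AutogradPrivateUse1 fallback used for ", op.schema().name());
  at::AutoDispatchBelowAutograd guard;
  op.callBoxed(stack);
}

// Because re-registration is now allowed, this replaces the built-in
// default fallback for the AutogradPrivateUse1 key.
TORCH_LIBRARY_IMPL(_, AutogradPrivateUse1, m) {
  m.fallback(torch::CppFunction::makeFromBoxedFunction<
             &my_accelerator_autograd_fallback>());
}
```

A backend that manages autograd entirely on its own could instead register `torch::CppFunction::makeFallthrough()` here, so calls pass through the autograd key untouched.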