[ONNX] Create fake implementations for onnx ops; fix boolean mask in attention #165780
justinchuby wants to merge 13 commits into pytorch:main from
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/165780
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 9fba564 with merge base b31bad1.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
cc @jambayk
@pytorchbot rebase
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here.
Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
Successfully rebased c88c97e to 22a148c
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…attention (#165780) Previously we relied on the concrete implementation to generate the fake implementation. This made the fake implementation overly complicated and broke in some cases with dynamic shapes. This PR updates ONNX op registration to instead take a dedicated fake implementation. **Also fixed: when a boolean mask is supplied to torch sdpa, its negation was previously taken, which is incorrect.** Fixes #164909. Also incorporates changes from #156635. Pull Request resolved: #165780 Approved by: https://github.com/titaiwangms
Previously we relied on the concrete implementation to generate the fake implementation. This made the fake implementation overly complicated and broke in some cases when there are dynamic shapes.
This PR updates ONNX op registration to instead take a dedicated fake implementation, as sketched below.
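A minimal sketch of the idea, not the PR's actual code: the PR's registration mechanism lives inside torch.onnx internals, but the generic `torch.library` API illustrates what a dedicated fake implementation buys you. The op name `mylib::rotary_embedding` and its math are hypothetical; the fake implementation only propagates shapes and dtypes, so it stays simple and works with symbolic (dynamic) shapes.

```python
import torch


@torch.library.custom_op("mylib::rotary_embedding", mutates_args=())
def rotary_embedding(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor) -> torch.Tensor:
    # Concrete (eager) implementation: the real math runs only on real tensors.
    half = x.shape[-1] // 2
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x2 * cos + x1 * sin], dim=-1)


@rotary_embedding.register_fake
def _(x, cos, sin):
    # Dedicated fake implementation: shape/dtype propagation only, never
    # derived from the concrete kernel, so tracing with FakeTensors and
    # dynamic shapes does not depend on the real computation.
    return torch.empty_like(x)
```

With a dedicated fake implementation registered this way, torch.export and the ONNX exporter can trace the op symbolically without running (or reasoning about) the concrete kernel.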
Also fixed: when a boolean mask is supplied to torch sdpa, its negation was previously taken, which is incorrect.
Fixes #164909. Also incorporates changes from #156635.
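A minimal sketch of the boolean-mask semantics the fix relies on (not the PR's code): for `torch.nn.functional.scaled_dot_product_attention`, a boolean `attn_mask` uses True = attend, False = mask out, which is equivalent to a float mask of 0.0 / -inf. Negating the mask before use flips the attended positions, which is the bug described above. Tensor shapes here are illustrative only.

```python
import torch
import torch.nn.functional as F

q = torch.randn(1, 2, 4, 8)  # (batch, heads, seq, head_dim)
k = torch.randn(1, 2, 4, 8)
v = torch.randn(1, 2, 4, 8)

# Causal boolean mask: True on/below the diagonal means those keys are visible.
bool_mask = torch.tril(torch.ones(4, 4, dtype=torch.bool))

# Equivalent float mask: masked-out (False) positions get -inf, kept positions 0.0.
# Note the conversion fills at ~bool_mask; filling at bool_mask (the negation bug)
# would mask the wrong positions.
float_mask = torch.zeros(4, 4).masked_fill(~bool_mask, float("-inf"))

out_bool = F.scaled_dot_product_attention(q, k, v, attn_mask=bool_mask)
out_float = F.scaled_dot_product_attention(q, k, v, attn_mask=float_mask)
assert torch.allclose(out_bool, out_float, atol=1e-6)
```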
cc @titaiwangms