[FX][Quant] Enable FX quant for patterns like x.view(x.size(...), ...) #90001
Conversation
🔗 Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/90001
✅ No failures as of commit 5adc1b8. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
jgong5
left a comment
LGTM. A separate question on the backend configuration. It seems awkward to me that we have to add such a common configuration to all the backends. Maybe we can have a group of common configurations that can be applied directly to all the backends?
Yes. Maybe we can do that in another PR.
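The idea discussed above could be sketched as follows. Note this is a hypothetical illustration: the names and the shape of the config data are invented for this sketch and are not the actual `torch.ao.quantization` BackendConfig API.

```python
# Hypothetical sketch of a "common config group" shared across backends.
# Pattern names and data shapes are invented for illustration only.
COMMON_PATTERNS = ["view", "size", "reshape"]

def apply_common_patterns(backend_patterns):
    """Extend each backend's own pattern list with the shared group."""
    return {backend: patterns + COMMON_PATTERNS
            for backend, patterns in backend_patterns.items()}

backends = {"fbgemm": ["linear"], "qnnpack": ["conv"]}
merged = apply_common_patterns(backends)
print(merged["fbgemm"])  # ['linear', 'view', 'size', 'reshape']
```

With a helper like this, patterns such as `view`/`size` would be registered once rather than duplicated in every backend's configuration.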
Hi @vkuzo @jerryzh168 @z-a-f Could you help review this? Thanks!
nit: this can also be:

    if not is_get_tensor_info_node(n):
        continue

to reduce indentation
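The suggested guard-clause style can be seen in a runnable sketch. The loop and node list below are hypothetical stand-ins, and `is_get_tensor_info_node` is a toy predicate mimicking the PR's helper of the same name:

```python
# Illustrative sketch of the suggested refactor; not code from the PR.
def is_get_tensor_info_node(n):
    return n.startswith("size")  # assumed predicate, for illustration only

nodes = ["size_0", "conv_1", "size_2"]
matched = []
for n in nodes:
    if not is_get_tensor_info_node(n):
        continue  # early continue avoids nesting the loop body one level deeper
    matched.append(n)
print(matched)  # ['size_0', 'size_2']
```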
OK. It's fixed
jerryzh168
left a comment
LGTM, maybe ask @vkuzo to take a look as well since he added the original PR
Hi @Xia-Weiwen, any plans to land this?
Hi @jerryzh168. This PR was waiting for #91297 to land first; that's why it has not landed yet. I tried to land #91297, but some unrelated CI checks failed. I will land both PRs today if there are no further CI issues.
The latest master branch is causing a lot of CI failures right now, probably due to commit 46f16b9. I will merge this tomorrow if CI checks pass.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours).
Summary
This work continues #83784 by @vkuzo and includes all the changes from that PR.
Quote from #83784:
Note: This PR needs #91297 to be landed first otherwise there is a UT failure.
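For reference, a minimal example of the pattern named in the title, where a tensor-info call like `x.size(...)` feeds a `view` on the same tensor `x`. The module and shapes are illustrative, not taken from the PR's tests:

```python
import torch

# Illustrative module using the title's pattern: x.view(x.size(...), ...).
# Such "get tensor info" nodes (size/shape) in the view's arguments are the
# case this PR enables FX quantization to handle.
class Flatten(torch.nn.Module):
    def forward(self, x):
        return x.view(x.size(0), -1)

m = Flatten()
out = m(torch.randn(2, 3, 4))
print(tuple(out.shape))  # (2, 12)
```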
Test plan
cc @jgong5 @mingfeima @XiaobingSuper @sanchitintel @ashokei @jingxu10