[ao][sparsity] Composability of fusion and sparsity #74847
Conversation
💊 CI failures summary and remediations: as of commit a42631a (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
Summary: As in the other PRs in this stack, the main problem was that fusion needed to detect the original module type of a parametrized module when sparsity prepare was called before fusion. There was also a potential issue with fusion running before sparsity prepare but after the sparse_config was created. In practice, however, fusion moves the references to the original modules into the fused module, so the sparse_config entries that pointed at the original modules stay valid automatically. If the fusion implementation changes, this could break, since no explicit handling or updating of those references was added.

Test Plan: python test/test_ao_sparsity.py TestComposability

Reviewers: Subscribers: Tasks: Tags: [ghstack-poisoned]
torch/nn/intrinsic/modules/fused.py (outdated)

# from torch.ao.quantization.utils import nonparam_type
from torch.nn.utils.parametrize import is_parametrized
def nonparam_type(module):
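The diff view only shows the helper's signature. A minimal, torch-free sketch of the idea it implements: parametrization (as done by torch.nn.utils.parametrize) replaces a module's class with a dynamically created subclass, so the pre-parametrization type can be recovered from the first base class. FakeLinear and _fake_parametrize are hypothetical stand-ins, not PyTorch APIs.

```python
# Torch-free sketch: parametrization swaps an instance's class for a
# dynamically created subclass, so the original type is the first base.
# `FakeLinear` and `_fake_parametrize` are hypothetical stand-ins.

class FakeLinear:
    pass

def _fake_parametrize(module):
    # Mimic what parametrization does structurally: replace the instance's
    # class with a fresh subclass of its original class.
    cls = type(module)
    param_cls = type("Parametrized" + cls.__name__, (cls,),
                     {"is_parametrized": True})
    module.__class__ = param_cls
    return module

def nonparam_type(module):
    # Return the module's type as it was before parametrization.
    if getattr(module, "is_parametrized", False):
        return type(module).__bases__[0]
    return type(module)

m = _fake_parametrize(FakeLinear())
print(nonparam_type(m) is FakeLinear)  # True
```

This lets fusion's type-based pattern matching keep working on modules that sparsity prepare has already parametrized.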
same as previous PR, type_before_parametrizations?
fixed
@HDCharles has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
torch/nn/intrinsic/modules/fused.py (outdated)

import torch
from torch.nn import Conv1d, Conv2d, Conv3d, ReLU, Linear, BatchNorm1d, BatchNorm2d, BatchNorm3d
# from torch.ao.quantization.utils import type_before_parametrizations
remove?
fixed
@HDCharles has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: Pull Request resolved: #74847

As in the other PRs in this stack, the main problem was that fusion needed to detect the original module type of a parametrized module when sparsity prepare was called before fusion. There was also a potential issue with fusion running before sparsity prepare but after the sparse_config was created. In practice, however, fusion moves the references to the original modules into the fused module, so the sparse_config entries that pointed at the original modules stay valid automatically. If the fusion implementation changes, this could break, since no explicit handling or updating of those references was added.

Test Plan: python test/test_ao_sparsity.py TestComposability

Imported from OSS

Reviewed By: vkuzo, andrewor14, jerryzh168

Differential Revision: D35240273

fbshipit-source-id: 62ed66689b285c3fa68f1e149266ab877f1cdd8e
Stack from ghstack (oldest at bottom):
Summary: As in the other PRs in this stack, the main problem was that
fusion needed to detect the original module type of a parametrized
module when sparsity prepare was called before fusion. There was also a
potential issue with fusion running before sparsity prepare but after
the sparse_config was created. In practice, however, fusion moves the
references to the original modules into the fused module, so the
sparse_config entries that pointed at the original modules stay valid
automatically. If the fusion implementation changes, this could break,
since no explicit handling or updating of those references was added.
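The claim that sparse_config "gets automatically updated" rests on Python reference semantics: fusion moves the module objects themselves into the fused container, so a config that stores references (rather than names) still points at the same objects afterwards. A torch-free sketch of this, with Conv and FusedSeq as hypothetical stand-ins for real nn modules and the real fused container:

```python
# Torch-free sketch: a config holding module references survives "fusion"
# because the same objects are moved, not copied. `Conv` and `FusedSeq`
# are hypothetical stand-ins for real nn modules.

class Conv:
    def __init__(self, name):
        self.name = name

class FusedSeq:
    def __init__(self, *mods):
        self.mods = list(mods)  # fusion keeps references to the originals

conv, relu = Conv("conv"), Conv("relu")
sparse_config = [{"module": conv}]   # config built before fusion

fused = FusedSeq(conv, relu)         # fusion "moves" the modules

# The config entry still points at the module now living inside `fused`.
print(sparse_config[0]["module"] is fused.mods[0])  # True
```

This also illustrates the fragility the summary warns about: if fusion ever started copying or rebuilding modules instead of moving references, the identity check above would fail and the config entries would silently go stale.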
Test Plan: python test/test_ao_sparsity.py TestComposability
Reviewers:
Subscribers:
Tasks:
Tags:
Differential Revision: D35240273