Conversation

@HDCharles HDCharles commented Mar 28, 2022

Stack from ghstack (oldest at bottom):

Summary: As in the other PRs in this stack, the main problem was that
fusion needed to detect the original module type of a parametrized
module when sparse prepare was called before fusion. There was also a
potential issue with running fusion before sparse prepare but after the
sparse_config was created. In practice, however, fusion moves the
references to the original modules into the fused module without issue,
so the original sparse_config, which pointed at those modules, is
updated automatically. If the fusion implementation changes, this could
become a problem, since no explicit handling or updating of these
pointers was added.
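A minimal, torch-free sketch of the detection problem described above: parametrization replaces a module's class with a dynamically created subclass, so a plain `type(module)` check no longer matches the original module type, but the original type is recoverable from the subclass's bases. All names below are illustrative stand-ins, not the actual PyTorch implementation.

```python
# Toy stand-ins; not the real torch.nn modules or parametrize machinery.
class Linear:
    pass

def attach_parametrization(module):
    # Mimic the effect of torch.nn.utils.parametrize: replace the instance's
    # class with a dynamically created subclass of the original class.
    parametrized_cls = type(
        "Parametrized" + type(module).__name__, (type(module),), {}
    )
    module.__class__ = parametrized_cls
    return module

def type_before_parametrizations(module):
    # Sketch of the helper fusion needs: if the class was swapped out,
    # the original type is the dynamic subclass's first base.
    cls = type(module)
    if cls.__name__.startswith("Parametrized"):
        return cls.__bases__[0]
    return cls

m = attach_parametrization(Linear())
assert type(m) is not Linear                      # plain type check now fails
assert type_before_parametrizations(m) is Linear  # original type recovered
```

This is why fusion has to look through the parametrization wrapper when sparse prepare ran first: matching fusion patterns against `type(module)` alone would miss every parametrized module.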

Test Plan: python test/test_ao_sparsity.py TestComposability

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: [D35240273](https://our.internmc.facebook.com/intern/diff/D35240273)
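The Summary's claim that sparse_config is updated automatically can also be illustrated with a torch-free sketch: the config stores references to module objects, and fusion moves those same objects (not copies) into the fused container, so the stored references stay valid. All names here are hypothetical stand-ins.

```python
# Hypothetical stand-ins for Conv2d, BatchNorm2d, and a fused module.
class Conv:
    pass

class BN:
    pass

class FusedConvBN:
    def __init__(self, conv, bn):
        # Fusion moves references to the original modules; nothing is copied.
        self.conv, self.bn = conv, bn

conv, bn = Conv(), BN()
sparse_config = [{"module": conv}]  # created before fusion runs
fused = FusedConvBN(conv, bn)       # fusion after sparse_config creation

# The config entry still points at the very object held by the fused module,
# so no explicit pointer update is needed.
assert sparse_config[0]["module"] is fused.conv
```

This only holds because fusion preserves object identity; a fusion implementation that rebuilt or copied the submodules would silently break the config, which is the caveat the Summary flags.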

@facebook-github-bot

facebook-github-bot commented Mar 28, 2022

🔗 Helpful links

💊 CI failures summary and remediations

As of commit a42631a (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.


@HDCharles HDCharles changed the title Composability of sparsity and fusion [ao][sparsity] Composability of fusion and sparsity Mar 28, 2022

# from torch.ao.quantization.utils import nonparam_type
from torch.nn.utils.parametrize import is_parametrized
def nonparam_type(module):
    # Original (pre-parametrization) type: parametrization replaces the class
    # with a dynamic subclass, so the first base is the original type.
    return module.__class__.__bases__[0] if is_parametrized(module) else type(module)
Contributor:
same as previous PR, type_before_parametrizations?

Contributor Author:
fixed

@albanD albanD removed their request for review March 29, 2022 18:09
@HDCharles

@HDCharles has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

import torch
from torch.nn import Conv1d, Conv2d, Conv3d, ReLU, Linear, BatchNorm1d, BatchNorm2d, BatchNorm3d

# from torch.ao.quantization.utils import type_before_parametrizations
Contributor:
remove?

Contributor Author:
fixed


facebook-github-bot pushed a commit that referenced this pull request Apr 8, 2022
Summary:
Pull Request resolved: #74847


Test Plan:
python test/test_ao_sparsity.py TestComposability

Imported from OSS

Reviewed By: vkuzo, andrewor14, jerryzh168

Differential Revision: D35240273

fbshipit-source-id: 62ed66689b285c3fa68f1e149266ab877f1cdd8e
@github-actions

github-actions bot commented Apr 8, 2022

Hey @HDCharles.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.
