@jerryzh168 (Contributor) commented Sep 16, 2020

Stack from ghstack:

Summary:
This is for feature parity with fx graph mode quantization

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: [D23745086](https://our.internmc.facebook.com/intern/diff/D23745086)

@dr-ci bot commented Sep 16, 2020

💊 CI failures summary: as of commit 6e66c85, 1/1 failures possibly introduced in this PR (1/1 non-CircleCI failure; ci.pytorch.org: 1 failed). This comment was automatically generated by Dr. CI.

jerryzh168 added a commit that referenced this pull request Sep 16, 2020 (ghstack-source-id: 7b74091, Pull Request resolved: #44835)
On the swap check, the diff changes

    if type(mod) not in SWAPPABLE_MODULES:

to

    # both swappable modules and observed custom modules are
    # swapped as one unit
    if type(mod) not in SWAPPABLE_MODULES and \
Reviewer (Contributor):
When we support custom module fusion, we will need to change the code here. Currently, it looks like SWAPPABLE_MODULES defines the list of fusions that are supported. Thoughts on how this can be addressed? The main use case would be: nn.Module A + nn.Module B => Custom Fused Module C

jerryzh168 (Author):
We can make this configurable with registrations as well; that can be done in a separate PR before we need this feature.
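As a rough illustration of the registration idea (a minimal sketch with purely hypothetical names, not real PyTorch APIs), a fusion config could map a pattern of module types to a custom fused module class:

```python
# Hypothetical sketch of a registration-based custom fusion config;
# none of these names are the actual PyTorch source.
CUSTOM_FUSION_PATTERNS = {}

def register_fusion_pattern(pattern, fused_cls):
    """Map a tuple of module types to the custom fused module class."""
    CUSTOM_FUSION_PATTERNS[pattern] = fused_cls

class ModA: pass
class ModB: pass

class FusedAB:
    """Stand-in for 'Custom Fused Module C' in the comment above."""
    def __init__(self, a, b):
        self.a, self.b = a, b

register_fusion_pattern((ModA, ModB), FusedAB)

def maybe_fuse(m1, m2):
    """Fuse two adjacent modules if a pattern is registered for them."""
    fused_cls = CUSTOM_FUSION_PATTERNS.get((type(m1), type(m2)))
    return fused_cls(m1, m2) if fused_cls is not None else None
```

With this shape, supporting a new custom fusion would be a registration call rather than an edit to a hard-coded list.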

self.checkDynamicQuantizedModule(quantized_model.emb, torch.nn.quantized.EmbeddingBag, torch.quint8)

@skipIfNoFBGEMM
def test_custom_module_class(self):
Reviewer (Contributor):
Similar request as in the previous PR: let's add tests for each of the workflows (PTQ, dynamic, and QAT), and also for modules with multiple inputs/outputs (LSTMs).

jerryzh168 (Author):
For dynamic/QAT, I will do that after the mixed mode support change. The LSTM test will be done after Zafar writes the LSTM PR, but I'll add multiple input/output support in a separate PR.
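The multiple-output case being discussed can be sketched in plain Python (no torch; all names hypothetical): an observed custom module such as an LSTM returning `(output, (h, c))` is swapped to its quantized counterpart as one unit, without recursing into its children.

```python
# Toy sketch of "swapped as one unit" for a custom module with
# multiple outputs; stand-in classes, not the PyTorch implementation.
class ObservedCustomLSTM:
    """Observed custom module returning (output, (hidden, cell))."""
    def forward(self, x):
        return x, (x + 1, x + 2)

class QuantizedCustomLSTM:
    """Quantized counterpart, created from the observed module wholesale."""
    @classmethod
    def from_observed(cls, observed):
        return cls()
    def forward(self, x):
        return x, (x + 1, x + 2)

OBSERVED_TO_QUANTIZED = {ObservedCustomLSTM: QuantizedCustomLSTM}

def swap_module(mod):
    """Swap an observed custom module as one unit; leave others alone."""
    quant_cls = OBSERVED_TO_QUANTIZED.get(type(mod))
    return quant_cls.from_observed(mod) if quant_cls is not None else mod
```

A test for this shape would check both that the swap happened and that the multiple outputs survive it.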

jerryzh168 added a commit that referenced this pull request Sep 22, 2020 (ghstack-source-id: dc04c72, Pull Request resolved: #44835)
@z-a-f left a comment:
LGTM

On the same change:

    # both swappable modules and observed custom modules are
    # swapped as one unit
    if type(mod) not in SWAPPABLE_MODULES and \
z-a-f:
nit: will SWAPPABLE_MODULES be a global variable, or are we planning on replacing it with a getter function?

jerryzh168 (Author):
Yeah, we can do that when there is a need.
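The getter-function alternative can be sketched as follows (hypothetical names, not the actual PyTorch source): keep the set private and expose an accessor, so the swappable set can later become dynamic, for example fed by registration.

```python
# Sketch of hiding the module-level list behind a getter/registration
# pair instead of exposing a mutable global directly.
_SWAPPABLE_MODULES = set()

def register_swappable_module(cls):
    """Add a module class to the swappable set; usable as a decorator."""
    _SWAPPABLE_MODULES.add(cls)
    return cls

def get_swappable_modules():
    """Return an immutable copy so callers cannot mutate the registry."""
    return frozenset(_SWAPPABLE_MODULES)

@register_swappable_module
class MyFloatModule:
    pass
```

Callers then test membership via `get_swappable_modules()` rather than importing the global, which keeps the later move to a registration-driven list a local change.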

@raghuramank100 (Contributor) left a comment:
If we can add more tests, as requested in the other PR, for both eager and IR custom module support, that would be great.

@codecov bot commented Sep 23, 2020

Codecov Report

❗ No coverage uploaded for pull request base (gh/jerryzh168/437/base@e64f20f).
The diff coverage is n/a.

@@                    Coverage Diff                    @@
##             gh/jerryzh168/437/base   #44835   +/-   ##
=========================================================
  Coverage                          ?   67.92%           
=========================================================
  Files                             ?      385           
  Lines                             ?    50169           
  Branches                          ?        0           
=========================================================
  Hits                              ?    34079           
  Misses                            ?    16090           
  Partials                          ?        0           

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update e64f20f...6e66c85.

jerryzh168 added a commit that referenced this pull request Sep 23, 2020 (ghstack-source-id: 0b5e297, Pull Request resolved: #44835)
jerryzh168 added a commit that referenced this pull request Sep 23, 2020 (ghstack-source-id: 7a078ac, Pull Request resolved: #44835)
@facebook-github-bot (Contributor): This pull request has been merged in f93ead6.

@facebook-github-bot deleted the gh/jerryzh168/437/head branch September 27, 2020 14:15