
Conversation

@supriyar (Contributor) commented Sep 4, 2020

Stack from ghstack:

Summary:
Reuse the existing embedding_bag operator, but set offsets to [0, 1, ..., len(indices)] so that each bag contains exactly one index.
Test Plan:
python test/test_quantization.py TestEmbeddingOps.test_embedding_byte


Differential Revision: D23547385
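The offsets trick from the summary can be sketched in plain Python. This is an illustrative model, not the actual ATen kernel, and `embedding_lookup` / `embedding_bag_sum` are hypothetical helper names: with offsets = [0, 1, ..., len(indices) - 1], every bag holds exactly one index, so a sum-mode embedding_bag degenerates to a plain per-index embedding lookup.

```python
def embedding_lookup(weights, indices):
    """Plain embedding: one weight row per index."""
    return [list(weights[i]) for i in indices]

def embedding_bag_sum(weights, indices, offsets):
    """Sum-mode embedding_bag: each offset marks the start of a bag."""
    bounds = list(offsets) + [len(indices)]
    out = []
    for start, end in zip(bounds, bounds[1:]):
        row = [0.0] * len(weights[0])
        for i in indices[start:end]:
            row = [r + w for r, w in zip(row, weights[i])]
        out.append(row)
    return out

weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
indices = [2, 0, 1, 1]
offsets = list(range(len(indices)))  # [0, 1, ..., len(indices) - 1]

# One-index bags make the "bag" operator equal to a plain lookup.
assert embedding_bag_sum(weights, indices, offsets) == embedding_lookup(weights, indices)
```

This is why the PR can delegate the quantized embedding lookup to the already-implemented quantized embedding_bag path instead of writing a new kernel.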

dr-ci bot commented Sep 4, 2020

💊 CI failures summary and remediations

As of commit 3e137ea (more details on the Dr. CI page):


  • 1/1 failures possibly* introduced in this PR
    • 1/1 non-CircleCI failure(s)

ci.pytorch.org: 1 failed



qresult = quant_op(packed_weight, indices, sparse=False)

ref = torch.embedding(weights, indices, padding_idx=-1, scale_grad_by_freq=False, sparse=False)
torch.testing.assert_allclose(ref, qresult, atol=0.005, rtol=1e-3)
Contributor:

Are the modified atol and rtol values only because of quantization error?

Contributor Author:

Yes

packed_weight = prepack_op(qweight)
qresult = quant_op(packed_weight, indices, sparse=False)

ref = torch.embedding(weights, indices, padding_idx=-1, scale_grad_by_freq=False, sparse=False)
Contributor:
If you use qweight.dequantize(), this should be an exact match
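Both review points can be seen in a minimal sketch of uint8 affine (de)quantization. The helper names below are illustrative, not the PyTorch API: round-tripping a float through quantization introduces an error of at most scale / 2 per value, which is why the test needs a relaxed atol against the original float weights, while comparing against the dequantized weights is exact by construction.

```python
def quantize_uint8(x, scale, zero_point):
    """Affine-quantize a float to uint8: q = round(x / scale) + zero_point."""
    q = round(x / scale) + zero_point
    return max(0, min(255, q))  # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    """Map a uint8 value back to float."""
    return (q - zero_point) * scale

scale, zero_point = 0.01, 128
x = 0.1234
x_hat = dequantize(quantize_uint8(x, scale, zero_point), scale, zero_point)

# Round-trip error is bounded by half a quantization step.
assert abs(x - x_hat) <= scale / 2

# Dequantized-vs-dequantized comparison is exact: both sides see x_hat.
assert x_hat == dequantize(quantize_uint8(x_hat, scale, zero_point), scale, zero_point)
```

Under this model, an atol on the order of scale / 2 (here 0.005) is the natural tolerance when the reference uses the original float weights.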

codecov bot commented Sep 8, 2020

Codecov Report

Merging #44207 into gh/supriyar/173/base will not change coverage.
The diff coverage is n/a.

@@                  Coverage Diff                  @@
##           gh/supriyar/173/base   #44207   +/-   ##
=====================================================
  Coverage                 69.24%   69.24%           
=====================================================
  Files                       381      381           
  Lines                     47573    47573           
=====================================================
  Hits                      32943    32943           
  Misses                    14630    14630           


@facebook-github-bot (Contributor) commented:

This pull request has been merged in 6013a29.

@facebook-github-bot facebook-github-bot deleted the gh/supriyar/173/head branch September 12, 2020 14:17