[Quant][test] Added test to check if fp16 packing->unpacking yields the same result as to(torch.float16).to(torch.float32) [reland] #73808
Conversation
…he same result as to(torch.float16).to(torch.float32) [reland] (#73808)

Summary: Pull Request resolved: #73808. A test was added in test_quantized_op.py that checks whether fp16 packing and subsequent unpacking of a given fp32 tensor produces the same result as to(torch.float16).to(torch.float32).

Test Plan: in the pytorch main directory, execute

```
python test/test_quantization.py TestDynamicQuantizedOps.test_linear_prepack_fp16_numerics
```

and

```
python test/test_quantization.py TestDynamicQuantizedOps.test_pack_unpack_fp16
```

Imported from OSS

Pulled By: dzdang
Differential Revision: D34650372
fbshipit-source-id: f58a2a964ddf259af3909b0e961d0b0dfd0e35e2
(cherry picked from commit ac8910e)
Stack from ghstack:
Summary:
A test was added in test_quantized_op.py that checks whether fp16 packing and subsequent unpacking of a given fp32 tensor produces the same result as to(torch.float16).to(torch.float32).
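The property under test can be illustrated outside PyTorch. Below is a minimal sketch using NumPy, whose float16 follows the same IEEE 754 half-precision semantics as torch.float16; the helper name `fp16_round_trip` is hypothetical and stands in for pack-then-unpack:

```python
import numpy as np

def fp16_round_trip(x: np.ndarray) -> np.ndarray:
    """Simulate fp16 packing followed by unpacking:
    cast fp32 -> fp16 (losing mantissa bits) -> back to fp32."""
    return x.astype(np.float16).astype(np.float32)

w = np.array([0.1, 1.0, 3.14159, 65504.0], dtype=np.float32)
rt = fp16_round_trip(w)

# The round trip is lossy (e.g. 0.1 is not exactly representable in fp16)...
assert not np.array_equal(rt, w)
# ...but it is idempotent: a second round trip changes nothing, which is why
# pack->unpack is expected to match the plain cast chain exactly.
assert np.array_equal(fp16_round_trip(rt), rt)
```

The test in the PR asserts this same equality between the packed/unpacked weights and the double-cast tensor.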
Test Plan:
in the pytorch main directory, execute

```
python test/test_quantization.py TestDynamicQuantizedOps.test_linear_prepack_fp16_numerics
```
Reviewed By: jerryzh168
Pulled By: dzdang
fbshipit-source-id: 5da453e5db4801dde196424282140726c8a4ef1f
(cherry picked from commit ac8910e)
Differential Revision: D34650372