Complex gradcheck logic #43208
Conversation
To unblock #43208, which adds "is_complex" checks to backward formulas that are being tested for batched gradient support with vmap. Test Plan: - `pytest test/test_vmap.py -v` Differential Revision: [D23685356](https://our.internmc.facebook.com/intern/diff/D23685356) [ghstack-poisoned]
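The kind of change this unblocks can be sketched in Python (the actual derivative code lives in PyTorch's C++ `derivatives.yaml` machinery; this is only an illustration of an `is_complex` branch in a backward formula). Under PyTorch's gradient convention, the backward of `mul` with respect to one operand multiplies the incoming gradient by the conjugate of the other operand when the tensors are complex:

```python
import torch

# Illustrative sketch (not the actual C++ derivative code): a backward
# formula for mul that branches on is_complex. For complex tensors the
# gradient w.r.t. `self` is grad_output times the conjugate of `other`.
def mul_backward_self(grad_output, other):
    if other.is_complex():
        return grad_output * other.conj()
    return grad_output * other

# Cross-check against autograd: for o = a * b, a.grad should be
# grad_output * conj(b) under the conjugate-Wirtinger convention.
a = torch.tensor(1.0 + 2.0j, dtype=torch.complex128, requires_grad=True)
b = torch.tensor(3.0 - 1.0j, dtype=torch.complex128)
(a * b).backward(torch.tensor(1.0 + 0.0j, dtype=torch.complex128))
manual = mul_backward_self(torch.tensor(1.0 + 0.0j, dtype=torch.complex128), b)
```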
This PR adds gradcheck for complex. The logic used for complex gradcheck is described in Section 3.5.3 here: https://arxiv.org/pdf/1701.00392.pdf

More concretely, this PR introduces the following changes:

1. Updates `get_numerical_jacobian` to take as input a scalar value for the vector (v). Adds gradcheck logic for C -> C, C -> R, and R -> C. For R -> C functions, only the real part of the gradient is propagated.
2. Adds a backward definition for `torch.complex` and a test to verify it.
3. Updates backward for `mul`, `sin`, `cos`, `sinh`, `cosh`.
4. Adds tests for `torch.real`, `torch.imag`, `torch.view_as_real`, `torch.view_as_complex`, `torch.conj`.

Follow-up tasks:

- [ ] Add more thorough tests for R -> C cases. Specifically, add R -> C test variants for functions, e.g. `torch.mul(complex_tensor, real_tensor)` (#44744)
- [ ] Add back the commented-out test in `common_methods_invocation.py` (#43208)
- [ ] Add more special-case checks to complex gradcheck to make debugging easier.
- [ ] Update the complex autograd note.
- [ ] Disable complex autograd for operators not tested for complex.
- [ ] Re-enable tests in `test_ops.py` for complex dtype (#43208)
- [ ] Re-enable `TestGradCheckOverride.test_gradcheck` cc @hameerabbasi

Differential Revision: [D23655088](https://our.internmc.facebook.com/intern/diff/D23655088)
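The numerical side of the scheme from Section 3.5.3 can be sketched as follows; this is a minimal, hypothetical illustration of the idea (the names are ours, not `gradcheck`'s): perturb the input separately along the real and the imaginary axis with a step of norm `eps` and take central differences.

```python
import torch

# Minimal sketch (not PyTorch's gradcheck code) of the numerical scheme in
# Section 3.5.3 of https://arxiv.org/pdf/1701.00392.pdf: estimate the
# directional derivatives of a C -> C function by central differences along
# the real and the imaginary axis separately.
def numerical_directional_grads(fn, x, eps=1e-6):
    def central_diff(delta):
        # the norm of delta equals eps in both calls below
        return (fn(x + delta) - fn(x - delta)) / (2 * eps)
    d_re = central_diff(eps)        # df / d Re(x)
    d_im = central_diff(eps * 1j)   # df / d Im(x)
    return d_re, d_im

x = torch.tensor(1.0 + 2.0j, dtype=torch.complex128)
d_re, d_im = numerical_directional_grads(torch.sin, x)
# sin is holomorphic, so by the Cauchy-Riemann equations d_re should match
# cos(x) and d_im should equal 1j * d_re.
```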
```python
# or pure imaginary delta
def compute_gradient(delta):
    # we currently assume that the norm of delta equals eps
    assert delta == eps or delta == (eps * 1j)
```
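For a C -> R function, the two `compute_gradient` calls (one with a real `delta`, one with a pure imaginary one) can be combined into the gradient autograd reports for a real-valued loss, `dL/dRe(x) + 1j * dL/dIm(x)`. A hypothetical, self-contained illustration (the loss function and variable names are ours, not gradcheck's):

```python
import torch

eps = 1e-6
x = torch.tensor(1.0 + 2.0j, dtype=torch.complex128)

def loss(z):
    return z.abs() ** 2  # |z|^2, a C -> R function

def compute_gradient(delta):
    # we currently assume that the norm of delta equals eps
    assert delta == eps or delta == (eps * 1j)
    return (loss(x + delta) - loss(x - delta)) / (2 * eps)

# Combine the two directional derivatives into dL/dRe + 1j * dL/dIm,
# the gradient convention autograd uses for real-valued losses.
grad = compute_gradient(eps) + 1j * compute_gradient(eps * 1j)
# For |z|^2 the expected gradient is 2*Re(z) + 2j*Im(z) = 2*z, i.e. 2+4j here.
```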
cc. @ngimel
Summary: Pull Request resolved: #44649 To unblock #43208, which adds "is_complex" checks to backward formulas that are being tested for batched gradient support with vmap. Test Plan: - `pytest test/test_vmap.py -v` Reviewed By: anjali411 Differential Revision: D23685356 Pulled By: zou3519 fbshipit-source-id: 29e41a9296336f6d1008e3040cade4c643bf5ebf
@anjali411 merged this pull request in 9f67176.

@hameerabbasi this PR has been merged, so would you like to create a follow-up PR to re-enable `TestGradCheckOverride.test_gradcheck`?
I'll work on that soon, thanks.
Hi @hameerabbasi, just checking in if you are still planning to work on re-enabling `TestGradCheckOverride.test_gradcheck`.
I apologize @anjali411, that fell off the radar due to a personal emergency. I'll add it to my to-do list so it's no longer just in my head.
no worries and hope all is well! feel free to add me as a reviewer once you have a PR ready with your changes. |
@hameerabbasi gentle ping for re-enabling `TestGradCheckOverride.test_gradcheck`.

There's a PR up that I submitted yesterday, #45732. Requested your review on it.
Thanks @hameerabbasi, it looks great. I totally missed your PR before, but looks like @albanD is already reviewing it :D
Stack from ghstack:

This PR adds gradcheck for complex. The logic used for complex gradcheck is described in Section 3.5.3 here: https://arxiv.org/pdf/1701.00392.pdf

More concretely, this PR introduces the following changes:

1. Updates `get_numerical_jacobian` to take as input a scalar value for the vector (v). Adds gradcheck logic for C -> C, C -> R, and R -> C. For R -> C functions, only the real part of the gradient is propagated.
2. Adds a backward definition for `torch.complex` and a test to verify it.
3. Updates backward for `mul`, `sin`, `cos`, `sinh`, `cosh`.
4. Adds tests for `torch.real`, `torch.imag`, `torch.view_as_real`, `torch.view_as_complex`, `torch.conj`.

Follow-up tasks:

- [ ] Add more thorough tests for R -> C cases. Specifically, add R -> C test variants for functions, e.g. `torch.mul(complex_tensor, real_tensor)` ([WIP] Add tests for R to C cases #44744)
- [ ] Add back the commented-out test in `common_methods_invocation.py` (Update backward definition for more operators and reenable tests in test_ops.py #44444)
- [ ] Add more special-case checks to complex gradcheck to make debugging easier.
- [ ] Update the complex autograd note.
- [ ] Disable complex autograd for operators not tested for complex (Add allowlist for complex backward #45461)
- [ ] Re-enable tests in `test_ops.py` for complex dtype (Update backward definition for more operators and reenable tests in test_ops.py #44444)
- [ ] Re-enable `TestGradCheckOverride.test_gradcheck` cc @hameerabbasi (Allow Tensor-likes in torch.autograd.gradcheck #45732)

Differential Revision: D23655088
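With these changes, `torch.autograd.gradcheck` can be called directly on complex inputs. A minimal usage sketch (double precision is required for the numerical comparison; the example functions are our choice):

```python
import torch
from torch.autograd import gradcheck

# gradcheck on a C -> C function: complex128 input with requires_grad=True.
x = torch.randn(3, dtype=torch.complex128, requires_grad=True)
assert gradcheck(torch.sin, (x,))

# And on a C -> R function, built from torch.real / torch.imag,
# whose backwards are among those tested by this PR.
y = torch.randn(3, dtype=torch.complex128, requires_grad=True)
assert gradcheck(lambda z: (z.real ** 2 + z.imag ** 2).sum(), (y,))
```

gradcheck returns `True` on success and raises a `GradcheckError` describing the mismatched Jacobian entries on failure.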