[nn] module: full_backward_pre_hook #86700
Conversation
🔗 Helpful links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/86700
Note: links to docs will display an error until the docs builds have completed.
✅ No failures as of commit 0f4a32e.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
```diff
 def register_module_backward_hook(
-    hook: Callable[['Module', _grad_t, _grad_t], Union[None, Tensor]]
+    hook: Callable[['Module', _grad_t, _grad_t], Union[None, _grad_t]]
```
I think the signature here was wrong.
Note: This change is not related to backward_pre_hook
```diff
 def register_module_full_backward_hook(
-    hook: Callable[['Module', _grad_t, _grad_t], Union[None, Tensor]]
+    hook: Callable[['Module', _grad_t, _grad_t], Union[None, _grad_t]]
```
I think the signature here was wrong.
Note: This change is not related to backward_pre_hook
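To illustrate the corrected annotation: `_grad_t` covers a tuple of Tensors, so a full backward hook that replaces gradients returns a tuple rather than a bare `Tensor`. A minimal sketch (the hook name and the 0.5 scale factor are made up for the example):

```python
import torch
import torch.nn as nn

lin = nn.Linear(3, 3)

def scale_grad_inputs(module, grad_input, grad_output):
    # grad_input and grad_output are tuples of Tensors (_grad_t), so a
    # hook that rewrites gradients returns a tuple, not a single Tensor.
    return tuple(g * 0.5 if g is not None else None for g in grad_input)

handle = lin.register_full_backward_hook(scale_grad_inputs)

x = torch.ones(2, 3, requires_grad=True)
lin(x).sum().backward()
handle.remove()
```

After `backward()`, `x.grad` is half of what it would be without the hook, because the returned tuple replaces `grad_input`.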
albanD left a comment:
@soulitzer can you take a look at this?
soulitzer left a comment:
Thanks, looks pretty good. Just have some small comments on the docs.
…lop/hooks/module-backward-pre-hook
```python
output = bn(torch.randn(5, 5, requires_grad=True))
output.sum().backward()

@skipIfTorchDynamo("TorchDynamo does not work well with hooks")
```
Fails on Dynamo. I have added a skip similar to other hook-related tests.
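For context, a decorator like `@skipIfTorchDynamo` can be sketched with just the standard library. This is an illustrative sketch, not PyTorch's actual implementation (which lives in its internal test utilities); the environment-variable name mirrors what PyTorch's CI sets when running tests under Dynamo:

```python
import os
import unittest

def skipIfTorchDynamo(msg="test does not work well with TorchDynamo"):
    # Sketch of a conditional-skip decorator: when the suite runs under
    # TorchDynamo (signalled here via an environment variable), the test
    # is skipped with the given reason; otherwise it runs unchanged.
    def decorator(fn):
        if os.environ.get("PYTORCH_TEST_WITH_DYNAMO") == "1":
            return unittest.skip(msg)(fn)
        return fn
    return decorator

@skipIfTorchDynamo("TorchDynamo does not work well with hooks")
def test_hooks():
    pass
```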
@soulitzer I have addressed all the points. PTAL, thanks :)
test/test_nn.py (Outdated)
```python
output.sum().backward()

@skipIfTorchDynamo("TorchDynamo does not work well with hooks")
def test_hook_backward_pre_and_full(self):
```
nit: we might want to rename this test now that both tests are "full"
Makes sense. Renaming it to test_backward_hooks_interaction.
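The interaction that the renamed test covers can be sketched as follows, assuming a PyTorch version that ships the `register_full_backward_pre_hook` this PR adds (the hook names and the 2.0 scale factor are illustrative): the pre-hook runs before the module's gradients are computed and may replace `grad_output`; the full backward hook runs after `grad_input` has been computed.

```python
import torch
import torch.nn as nn

lin = nn.Linear(3, 3)
seen = []

def pre_hook(module, grad_output):
    seen.append("pre")
    # Runs before this module's backward; may return a replacement
    # tuple of grad_outputs.
    return tuple(g * 2.0 for g in grad_output)

def full_hook(module, grad_input, grad_output):
    seen.append("full")
    # Returning None leaves grad_input unchanged.

lin.register_full_backward_pre_hook(pre_hook)
lin.register_full_backward_hook(full_hook)

x = torch.ones(2, 3, requires_grad=True)
lin(x).sum().backward()
# After backward: seen == ["pre", "full"], and x.grad is doubled
# because the pre-hook scaled grad_output before grad_input was computed.
```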
soulitzer left a comment:
LGTM
@pytorchbot merge -g
Merge started. Your change will be merged once all checks on your PR pass since you used the green (-g) flag (ETA: 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Hey @kshitij12345.
Fixes #42824