Add reset_grad() function (#42754) #44423
Conversation
This pull request was exported from Phabricator. Differential Revision: D23010859
CI failures summary (Dr. CI), as of commit 9b1dbe4: ci.pytorch.org: 1 failed.
Force-pushed 25b1c02 to 4bf362c
Force-pushed 4bf362c to e6c213b
Force-pushed e6c213b to 7c84b0e
Review comment on torch/nn/modules/module.py (outdated): ``torch.optimizer``
Review comment on torch/optim/optimizer.py (outdated): ``zero_grad(set_to_none=True)`` followed by a backward pass, ``.grad``\ s
Review comment on torch/optim/optimizer.py (outdated): ``torch.optim``
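For context, a minimal sketch of the ``set_to_none`` behavior discussed in the review comments above (the model, optimizer, and data below are placeholders for illustration, not part of this PR):

```python
import torch
import torch.nn as nn

# Placeholder model, optimizer, and data.
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
inp = torch.randn(8, 4)
target = torch.randn(8, 2)

# With set_to_none=True, zero_grad() resets each parameter's .grad to None
# instead of filling it with zeros; the next backward pass allocates fresh grads.
optimizer.zero_grad(set_to_none=True)
loss = nn.functional.mse_loss(model(inp), target)
loss.backward()
optimizer.step()

# Modules expose the same option.
model.zero_grad(set_to_none=True)
```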
Summary:
Pull Request resolved: pytorch#44423
Pull Request resolved: pytorch#42754
Test Plan: Imported from OSS
Reviewed By: mruberry
Differential Revision: D23010859
Pulled By: ngimel
fbshipit-source-id: 760279f7c9cb84d11bef51207c18bf1f362ca7ad
Force-pushed 7c84b0e to 9b1dbe4
Codecov Report

@@            Coverage Diff            @@
##           master   #44423   +/-   ##
=======================================
  Coverage   67.99%   67.99%
=======================================
  Files         382      382
  Lines       49385    49389     +4
=======================================
+ Hits        33579    33582     +3
- Misses      15806    15807     +1

Continue to review full report at Codecov.
Summary: https://pytorch.org/tutorials/recipes/recipes/amp_recipe.html is live. Core amp docs should reference it. Also I fixed some typos in the `zero_grad` docs we ignored when git was behaving weirdly during ngimel's merge of #44423.
Pull Request resolved: #44725
Reviewed By: mruberry
Differential Revision: D23723807
Pulled By: ngimel
fbshipit-source-id: ca0b76365f8ca908bd978e3b38bf81857fa6c2a3
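For reference, the amp recipe linked above covers the standard ``torch.cuda.amp`` training-loop pattern, which combines naturally with ``zero_grad(set_to_none=True)``. A minimal sketch under the assumption that a CUDA device is available (model, optimizer, and data are placeholders):

```python
import torch
import torch.nn as nn

# Placeholder model, optimizer, and data on a CUDA device.
device = "cuda"
model = nn.Linear(64, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scaler = torch.cuda.amp.GradScaler()

inputs = torch.randn(32, 64, device=device)
targets = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad(set_to_none=True)
# Run the forward pass in mixed precision.
with torch.cuda.amp.autocast():
    loss = nn.functional.cross_entropy(model(inputs), targets)
# Scale the loss to avoid fp16 gradient underflow, then step via the scaler.
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```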
Summary: Pull Request resolved: #42754
Test Plan: Imported from OSS
Differential Revision: D23010859
Pulled By: firstprayer