Make Adadelta, Adagrad & Adamax differentiable
#86096
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/86096. Note: Links to docs will display an error until the docs builds have been completed. ✅ No failures as of commit d9e91d8. This comment was automatically generated by Dr. CI and updates every 15 minutes.
Adadelta & Adagrad differentiable
/easycla As part of the transition to the PyTorch Foundation, this project now requires contributions be covered under the new CLA. See #85559 for additional details. This comment will trigger a new check of this PR. If you are already covered, you will simply see a new "EasyCLA" check that passes. If you are not covered, a bot will leave a new comment with a link to sign.
Force-pushed from 6fca7eb to 54475d5.
Changed the title from "Adadelta & Adagrad differentiable" to "Adadelta, Adagrad & Adamax differentiable".
Force-pushed from b1fb0a1 to 69a875d.
torch/optim/adamax.py (Outdated)
Is the distinction here to maintain a functional style (no mutations)?
This is to avoid an in-place modification of exp_inf that would trigger autograd errors :)
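To make the trade-off concrete, here is a minimal sketch of the distinction, written as a standalone helper rather than the PR's actual diff: Adamax keeps a running infinity-norm buffer `exp_inf`, and updating it in place discards information autograd needs to backpropagate through the optimizer step, while the out-of-place form keeps the graph intact.

```python
import torch

def update_exp_inf(exp_inf, grad, beta2, eps, differentiable):
    # Hypothetical helper (not the PR's actual code): update Adamax's
    # infinity-norm buffer, exp_inf <- max(beta2 * exp_inf, |grad| + eps).
    if differentiable:
        # Out-of-place: builds a fresh tensor, so autograd can trace the update.
        return torch.maximum(exp_inf * beta2, grad.abs() + eps)
    # In-place: cheaper, but mutating a tensor that autograd still needs for
    # backward raises an error when differentiating through the step.
    norm_buf = torch.stack([exp_inf.mul_(beta2), grad.abs().add_(eps)])
    torch.amax(norm_buf, 0, out=exp_inf)
    return exp_inf
```

The in-place path stays the default since it avoids allocating a new buffer on every step; the out-of-place path is only needed when the step itself must be differentiated.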
@emcastillo this change looks good overall! Could you sign the CLA and rebase your PR to make the pr_sanity_check green? The XLA failure looks unrelated, so rerunning to make sure that's the case.
Yeah, I am waiting for my company to fill out the corporate CLA, so it will take a few days :)
Could you rebase locally to see if you can clear the other two CI failures? You can also comment "@pytorchbot rebase" to rebase onto our stable viable/strict branch.
Force-pushed from 69a875d to ed11691.
Nice! This PR would be good to go once your company gets the CLA done. |
Force-pushed from ed11691 to d9e91d8.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Continuing the differentiable optimizers support
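A hedged usage sketch of how this kind of support is typically exposed (the `differentiable` keyword argument is assumed here to match the flag used by the related differentiable-optimizer work; it is not spelled out in this thread): the optimizers are constructed as usual, and the flag asks `step()` to avoid the graph-breaking in-place state updates so the update itself can be differentiated, e.g. for unrolled or meta-learning setups.

```python
import torch
from torch.optim import Adadelta, Adagrad, Adamax

# Assumed keyword name: `differentiable` (not confirmed in this thread).
model = torch.nn.Linear(4, 2)
opt_adadelta = Adadelta(model.parameters(), lr=1.0, differentiable=True)
opt_adagrad = Adagrad(model.parameters(), lr=1e-2, differentiable=True)
opt_adamax = Adamax(model.parameters(), lr=2e-3, differentiable=True)
```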