
Commit 2ac9abf

jerry73204 authored and facebook-github-bot committed
Fix memory leak in Adam, Adagrad, RMSProp (#23125)
Summary: As reported in LaurentMazare/tch-rs#76, memory usage grows when weight_decay is set on the Adam optimizer. This applies the same fix as #23007 to Adam, Adagrad and RMSprop.

Pull Request resolved: #23125
Differential Revision: D16402421
Pulled By: soumith
fbshipit-source-id: 59eb4bd81b8bd9e1a5f7c068ed841f70a4c38a80
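A minimal standalone sketch (not part of the commit, names are illustrative) of why the guard matters: when a parameter requires grad, the weight-decay expression grad + weight_decay * p records an autograd node on every optimizer step, so the graph and its buffers keep accumulating; wrapping the update in NoGradGuard keeps it a plain arithmetic operation detached from the graph.

#include <torch/torch.h>
#include <iostream>

int main() {
  // A parameter that tracks gradients, as optimizer parameters do.
  torch::Tensor p = torch::randn({3}, torch::requires_grad());
  torch::Tensor grad = torch::randn({3});
  double weight_decay = 1e-2;

  // Without the guard, the result stays attached to the autograd graph
  // because `p` requires grad; repeated every step, these graph nodes
  // are what accumulate.
  torch::Tensor decayed = grad + weight_decay * p;
  std::cout << std::boolalpha << decayed.requires_grad() << "\n";  // true

  {
    // With the guard, the same expression is not recorded and the
    // result is detached from the graph.
    torch::NoGradGuard no_grad;
    torch::Tensor detached = grad + weight_decay * p;
    std::cout << detached.requires_grad() << "\n";  // false
  }
  return 0;
}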
1 parent: 96b6797

3 files changed: 3 additions, 0 deletions

torch/csrc/api/src/optim/adagrad.cpp
1 addition, 0 deletions

@@ -24,6 +24,7 @@ void Adagrad::step() {
     }
 
     if (options.weight_decay_ > 0) {
+      NoGradGuard guard;
       p.grad() = p.grad() + options.weight_decay_ * p;
     }

torch/csrc/api/src/optim/adam.cpp
1 addition, 0 deletions

@@ -23,6 +23,7 @@ void Adam::step() {
     }
 
     if (options.weight_decay_ > 0) {
+      NoGradGuard guard;
       p.grad() = p.grad() + options.weight_decay_ * p;
     }

torch/csrc/api/src/optim/rmsprop.cpp
1 addition, 0 deletions

@@ -24,6 +24,7 @@ void RMSprop::step() {
     }
 
     if (options.weight_decay_ > 0) {
+      NoGradGuard guard;
       p.grad() = p.grad() + options.weight_decay_ * p;
     }
