Commit 287aa76

yf225 authored and soumith committed

Fix clip_grad_norm use for word_language_model example (#332)

Parent: de5ba9f

File tree: 1 file changed (+1, −1)
word_language_model/main.py

Lines changed: 1 addition & 1 deletion
@@ -160,7 +160,7 @@ def train():
         loss.backward()
 
         # `clip_grad_norm` helps prevent the exploding gradient problem in RNNs / LSTMs.
-        torch.nn.utils.clip_grad_norm(model.parameters(), args.clip)
+        torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
         for p in model.parameters():
             p.data.add_(-lr, p.grad.data)
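Background: PyTorch 0.4 renamed `clip_grad_norm` to `clip_grad_norm_`, following the convention that a trailing underscore marks an in-place operation; the old name was deprecated. Below is a minimal, self-contained sketch of how the updated call fits into a training step like the one above. The toy LSTM, tensor shapes, and hyperparameter values are illustrative assumptions, not taken from the example itself.

```python
import torch
import torch.nn as nn

# Hypothetical toy setup -- sizes and hyperparameters are assumptions,
# chosen only to make the snippet runnable.
model = nn.LSTM(input_size=10, hidden_size=20)
criterion = nn.MSELoss()
lr = 20        # stands in for args.lr in the example
clip = 0.25    # stands in for args.clip in the example

data = torch.randn(5, 3, 10)      # (seq_len, batch, input_size)
targets = torch.randn(5, 3, 20)   # matches the LSTM's hidden_size

output, _ = model(data)
loss = criterion(output, targets)

model.zero_grad()
loss.backward()

# clip_grad_norm_ rescales all gradients in place so that their combined
# norm is at most `clip`; it returns the total norm measured before clipping.
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), clip)

# Manual SGD step, as in the example's pre-optimizer style; written here with
# the modern keyword form of the example's p.data.add_(-lr, p.grad.data).
for p in model.parameters():
    p.data.add_(p.grad.data, alpha=-lr)
```

Without the clipping step, a single batch with steep loss can produce gradients large enough to blow up this manual SGD update, which is why the example clips before applying `p.data.add_`.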

Comments (0)