
Commit 082936f

sampepose authored and facebook-github-bot committed
Clarify cycliclr param docs (#20880)
Summary: Pull Request resolved: #20880

This clarifies how the momentum parameters should be used.

Reviewed By: soumith
Differential Revision: D15482450
fbshipit-source-id: e3649a38876c5912cb101d8e404abca7c3431766
1 parent 68c3ef7 commit 082936f

File tree

1 file changed: +8 −3 lines


torch/optim/lr_scheduler.py

Lines changed: 8 additions & 3 deletions
@@ -498,16 +498,21 @@ class CyclicLR(_LRScheduler):
         cycle_momentum (bool): If ``True``, momentum is cycled inversely
             to learning rate between 'base_momentum' and 'max_momentum'.
             Default: True
-        base_momentum (float or list): Initial momentum which is the
-            lower boundary in the cycle for each parameter group.
+        base_momentum (float or list): Lower momentum boundaries in the cycle
+            for each parameter group. Note that momentum is cycled inversely
+            to learning rate; at the peak of a cycle, momentum is
+            'base_momentum' and learning rate is 'max_lr'.
             Default: 0.8
         max_momentum (float or list): Upper momentum boundaries in the cycle
             for each parameter group. Functionally,
             it defines the cycle amplitude (max_momentum - base_momentum).
             The momentum at any cycle is the difference of max_momentum
             and some scaling of the amplitude; therefore
             base_momentum may not actually be reached depending on
-            scaling function. Default: 0.9
+            scaling function. Note that momentum is cycled inversely
+            to learning rate; at the start of a cycle, momentum is 'max_momentum'
+            and learning rate is 'base_lr'
+            Default: 0.9
         last_epoch (int): The index of the last batch. This parameter is used when
             resuming a training job. Since `step()` should be invoked after each
             batch instead of after each epoch, this number represents the total
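
For reference, a minimal sketch (not part of the commit) of how the clarified behavior plays out in practice: with cycle_momentum=True, momentum moves opposite to the learning rate, so a cycle starts at (base_lr, max_momentum) and peaks at (max_lr, base_momentum). The tiny model and step counts below are illustrative only.

import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer,
    base_lr=0.001, max_lr=0.1,   # learning rate cycles between these bounds
    step_size_up=4,              # a tiny half-cycle so the pattern is visible
    cycle_momentum=True,
    base_momentum=0.8, max_momentum=0.9,
)

for step in range(9):
    group = optimizer.param_groups[0]
    # At step 0 (start of cycle): lr == base_lr, momentum == max_momentum.
    # At step 4 (peak of cycle):  lr == max_lr,  momentum == base_momentum.
    print(f"step {step}: lr={group['lr']:.4f} momentum={group['momentum']:.4f}")
    optimizer.step()   # normally preceded by a backward pass
    scheduler.step()   # CyclicLR steps per batch, not per epoch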
