
Commit c2561b1 (1 parent: 1820f46)

Replace Ranger optimizer with RAdam

File tree: 3 files changed, +2 −187 lines

README.md (0 additions, 2 deletions)

@@ -28,5 +28,3 @@ Then, go to http://localhost:6006/
 
 # Acknowledgements
 * Training code is based on https://github.com/glinscott/nnue-pytorch
-* Ranger optimizer is taken from https://github.com/lessw2020/Ranger-Deep-Learning-Optimizer
-

ranger.py (0 additions, 182 deletions)

This file was deleted.

train.py (2 additions, 3 deletions)

@@ -2,7 +2,6 @@
 import model as M
 import nnue_dataset
 import torch
-import ranger
 import time
 import os.path
 from datetime import timedelta
@@ -149,8 +148,8 @@ def main(args):
   nnue = M.NNUE().to(main_device)
 
   # Configure optimizer
-  optimizer = ranger.Ranger(nnue.parameters(), lr=1e-3)
-  scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=1, verbose=True, min_lr=1e-6)
+  optimizer = torch.optim.RAdam(nnue.parameters(), lr=1e-3, betas=(.95, 0.999), eps=1e-5, weight_decay=0)
+  scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.3, patience=1, verbose=True, min_lr=1e-6)
 
   # Main training loop
   start = time.monotonic()
