
Commit 6eb3969

tianzhi0549 authored and facebook-github-bot committed
keep requires_grad unchanged after converting bn to syncbn (#22569)
Summary: After converting BN layers to SyncBN layers, the function sets `requires_grad = True` on all affine parameters, regardless of their original `requires_grad` states. I think this is a bug and have fixed it in this PR. Pull Request resolved: #22569 Differential Revision: D16151647 Pulled By: zou3519 fbshipit-source-id: e2ad1886c94d8882485e7fb8be51ad76469ecc67
1 parent: cbb0b81 · commit: 6eb3969

File tree: 1 file changed (+3, -0 lines)


torch/nn/modules/batchnorm.py

Lines changed: 3 additions & 0 deletions
```diff
@@ -493,6 +493,9 @@ def convert_sync_batchnorm(cls, module, process_group=None):
         if module.affine:
             module_output.weight.data = module.weight.data.clone().detach()
             module_output.bias.data = module.bias.data.clone().detach()
+            # keep requires_grad unchanged
+            module_output.weight.requires_grad = module.weight.requires_grad
+            module_output.bias.requires_grad = module.bias.requires_grad
         module_output.running_mean = module.running_mean
         module_output.running_var = module.running_var
         module_output.num_batches_tracked = module.num_batches_tracked
```
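For context, here is a minimal sketch (not part of the commit) of the behavior this patch preserves: freeze a `BatchNorm2d` layer's affine parameters, convert the model with the real `torch.nn.SyncBatchNorm.convert_sync_batchnorm` API, and check that the frozen state survives. The model layout is purely illustrative.

```python
import torch.nn as nn

# Small model with a frozen BatchNorm layer.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.BatchNorm2d(8),
)
for p in model[1].parameters():
    p.requires_grad = False  # freeze the BN affine parameters (weight and bias)

# Convert every BN layer to SyncBatchNorm. The conversion itself does not
# need an initialized process group; only SyncBatchNorm's forward pass does.
sync_model = nn.SyncBatchNorm.convert_sync_batchnorm(model)

bn = sync_model[1]
assert isinstance(bn, nn.SyncBatchNorm)
# With this fix, the frozen state survives the conversion.
assert not bn.weight.requires_grad
assert not bn.bias.requires_grad
```

Before this change, the converted `SyncBatchNorm` parameters came back with `requires_grad=True`, silently unfreezing layers that callers had deliberately frozen (for example, during fine-tuning).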
