
Conversation

@lazypanda1 (Contributor)

This PR adds a check that prevents division-by-zero errors and gives users a friendlier error message when using the Adam optimizer.

Currently, if the first beta value of the Adam optimizer is set to 1.0, the program fails with ZeroDivisionError: float division by zero. By the definition of Adam, beta values should be in the range [0, 1).

See also #751, where I first encountered and tried to fix this issue. I believe this would benefit all clients using PyTorch as a backend.
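
For context, the failure comes from Adam's bias correction: with beta1 = 1.0 the term 1 - beta1**step is zero, and the step size divides by it. A minimal sketch of the kind of guard being added, written here as a hypothetical make_adam wrapper for illustration (the actual change adds the check to the optimizer's constructor; only the error-message format is taken from the test quoted below):

import torch
import torch.optim as optim


def make_adam(params, betas=(0.9, 0.999), **kwargs):
    # Hypothetical wrapper illustrating the guard: reject betas outside [0, 1)
    # up front so construction fails with a clear ValueError instead of the
    # optimizer hitting ZeroDivisionError later in its update step.
    for i, beta in enumerate(betas):
        if not 0.0 <= beta < 1.0:
            raise ValueError("Invalid beta parameter at index {}: {}".format(i, beta))
    return optim.Adam(params, betas=betas, **kwargs)


weight = torch.randn(10, 5, requires_grad=True)
try:
    make_adam([weight], lr=1e-2, betas=(1.0, 0.0))
except ValueError as e:
    print(e)  # Invalid beta parameter at index 0: 1.0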


lambda weight, bias: optim.Adam(
[weight, bias], lr=1e-2, betas=(1.0, 0.0), amsgrad=True),
"Invalid beta parameter at index 0: 1.0"
)

This comment was marked as off-topic.

@lazypanda1 (Contributor, Author)

@apaszke I have reduced the test. Please review.
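
For reference, a reduced test can boil down to a single assertion on the constructor. A sketch along those lines, assuming the error message quoted in the review context above (the merged test may differ):

import unittest

import torch
import torch.optim as optim


class TestAdamBetaValidation(unittest.TestCase):
    def test_invalid_first_beta(self):
        # Constructing Adam with beta1 = 1.0 should fail fast with a readable
        # ValueError rather than a ZeroDivisionError during the first step.
        weight = torch.randn(10, 5, requires_grad=True)
        with self.assertRaisesRegex(ValueError, "Invalid beta parameter at index 0: 1.0"):
            optim.Adam([weight], lr=1e-2, betas=(1.0, 0.0))


if __name__ == "__main__":
    unittest.main()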

@apaszke (Contributor) commented Feb 11, 2018

@pytorchbot test this please

@soumith merged commit a061000 into pytorch:master Feb 12, 2018

@soumith (Contributor) commented Feb 12, 2018

thank you @lazypanda1 !
