Implement logsumexp (fixes #2591) #7254
Conversation
ssnl left a comment:
Thanks! Can you add inf cases in the test? This looks great otherwise.
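The inf cases matter because a naive `log(sum(exp(x)))` overflows for large inputs and mishandles infinite entries. Below is a minimal plain-Python sketch (not the PR's actual implementation) of a numerically stable logsumexp with the inf cases handled explicitly:

```python
import math

def logsumexp(xs):
    # Stable log(sum(exp(x) for x in xs)): shift by the max so exp()
    # never sees a large positive argument.
    m = max(xs)
    if math.isinf(m):
        # m == +inf: one entry dominates and the result is +inf.
        # m == -inf: every entry is -inf, the sum of exponentials is 0,
        # and log(0) is -inf. Either way, return m directly to avoid
        # computing inf - inf (which would be nan).
        return m
    return m + math.log(sum(math.exp(x - m) for x in xs))

print(logsumexp([1000.0, 1000.0]))      # ~1000.693; the naive form overflows
print(logsumexp([float("-inf"), 0.0]))  # 0.0
print(logsumexp([float("inf"), 0.0]))   # inf
```

These are exactly the edge cases a test for the inf behavior would assert on: a `+inf` entry dominates, an all-`-inf` input yields `-inf`, and finite `-inf` mixtures contribute a harmless `exp(-inf) == 0` term.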
ssnl left a comment:
Good to merge if test passes
soumith left a comment:
simon reviewed, so accepting
ssnl left a comment:
Actually, could you write a custom backward? The autograd backward isn't very efficient. The formula should just be exp(input - output) for the 1d case.
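For context on the suggested backward: the derivative of lse = log Σⱼ exp(xⱼ) with respect to xᵢ is exp(xᵢ − lse), i.e. the softmax of the inputs, so a custom backward can reuse the saved forward output in one elementwise pass instead of replaying the autograd graph. A hypothetical plain-Python sketch (names and structure are illustrative, not the PR's kernel):

```python
import math

def logsumexp_backward(grad_out, xs, lse):
    # d lse / d x_i = exp(x_i - lse): the softmax of the inputs,
    # computed from the saved forward output lse in a single pass.
    return [grad_out * math.exp(x - lse) for x in xs]

# Usage: forward pass, then backward with an upstream gradient of 1.
xs = [0.0, 1.0, 2.0]
m = max(xs)
lse = m + math.log(sum(math.exp(x - m) for x in xs))
grads = logsumexp_backward(1.0, xs, lse)
print(sum(grads))  # softmax weights, so the gradients sum to ~1.0
```

Because exp(xᵢ − lse) ≤ 1 by construction, the backward is also numerically safe even when the inputs are large.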
Windows error seems legit, if a bit strange:
I looked at the code and I don't see why this test would have triggered only on Windows. Looks like it will need debugging. Possibly an error elsewhere in PyTorch that is being exercised here.
@ezyang Thank you for looking into that. Interestingly, I got failures on Linux (gcc 7.3) as well when I wanted to test the backward; it didn't fail yesterday, before I wrote the backward. It turned out that I had forgotten to specify
@pytorchbot retest this please
This needs rebasing
Thank you Simon and Edward for your comments!
@pytorchbot retest this please
@ezyang I think the PR now has two spurious failures instead of one.
@pytorchbot retest this please
Thank you! There seems to be a different failure now, but I'm afraid I'm not sure what is going on.
@pytorchbot retest this please
thanks @t-vi for the contribution!
@t-vi @zou3519 The docstring isn't shown in the docs online: the search at https://pytorch.org/docs/master/search.html?q=logsumexp&check_keywords=yes&area=default comes up empty. Also, it seems PyTorch now has two copies of
* Implement logsumexp (fixes pytorch#2591)
* Add logsumexp_backward, fix _out declaration. Thank you Simon and Edward for your comments!