Add CELU activation to pytorch #8551
Conversation
Just one unrelated CI failure; it should be working now.
Just some information: the changes in this PR are very similar to #4269 and b6a30f7 by @colesbury.
Thanks for your contribution! Were you able to compile locally? It seems to fail to compile on our CI machines.
@ssnl A test added on the master branch broke the build of this PR after it was merged in. It is fixed now, and this PR is ready for review.
Misclicked "land". Fortunately it didn't go through.
aten/src/ATen/native/Activation.cpp (outdated diff):
```cpp
Tensor celu(const Tensor & self, Scalar alpha) {
  Tensor inv_alpha = alpha.toTensor().reciprocal();
```
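For context, the reciprocal is presumably computed so it can be passed as the fused input scale: per this PR's summary ("Also fuse input scale multiplication into ELU"), CELU can be expressed through the ELU kernel as CELU(x, α) = ELU(x, alpha=α, input_scale=1/α). A minimal Python sketch of that identity, with the elu signature assumed for illustration rather than quoted from ATen:

```python
import math

def elu(x, alpha=1.0, scale=1.0, input_scale=1.0):
    # ELU with fused output/input scales (signature assumed, not quoted from ATen)
    return scale * (x if x > 0 else alpha * (math.exp(x * input_scale) - 1.0))

def celu_via_elu(x, alpha=1.0):
    # CELU(x, a) == ELU(x, alpha=a, input_scale=1/a)
    return elu(x, alpha=alpha, input_scale=1.0 / alpha)
```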
houseroad left a comment:
The ONNX part looks good to me.
Nit: add a test case in https://github.com/pytorch/pytorch/blob/master/test/onnx/test_operators.py to cover Elu.
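A sketch of what such a test might look like, following the file's existing style of per-operator methods; the test name and exact arguments here are assumptions for illustration, not part of this PR:

```python
# Hypothetical addition to test/onnx/test_operators.py, alongside the
# existing ELU test; assertONNX is the file's own comparison helper.
def test_celu(self):
    x = torch.randn(1, 2, 3, 4, requires_grad=True)
    self.assertONNX(torch.nn.CELU(alpha=2.0), x)
```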
| r"""Applies element-wise, | ||
| :math:`\text{CELU}(x) = \max(0,x) + \min(0, \alpha * (\exp(x/\alpha) - 1))` | ||
| More details can be found in the paper `Continuously Differentiable Exponential Linear Units`_ . |
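For reference, the formula in that docstring as a standalone scalar sketch in plain Python (not the ATen implementation from this PR):

```python
import math

def celu(x, alpha=1.0):
    # CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

assert celu(1.0) == 1.0  # positive inputs pass through unchanged
assert abs(celu(-1.0) - (math.exp(-1.0) - 1.0)) < 1e-12  # with alpha=1, matches ELU
```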
@houseroad @ssnl I've made those improvements.
facebook-github-bot left a comment:
SsnL has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Thanks! @zasdfgbnm
Summary: Also fuse input scale multiplication into ELU
Paper: https://arxiv.org/pdf/1704.07483.pdf
Pull Request resolved: pytorch/pytorch#8551
Differential Revision: D9088477
Pulled By: SsnL
fbshipit-source-id: 877771bee251b27154058f2b67d747c9812c696b