Conversation

@weiyangfb (Contributor) commented Jun 21, 2018

Summary:

  1. Fixes #6245 ("Deprecate torch.nn.functional.tanh?")
  2. Deprecates tanh and sigmoid


@soumith (Contributor) commented Jun 21, 2018

What happens if a user calls torch.nn.functional.sigmoid(x) now: is it an error or a deprecation warning? (It should be a deprecation warning.)

@weiyangfb (Contributor, Author)

@soumith Ah, I see. Right now it is an error; I will make it a deprecation warning instead.
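For reference, the usual shape of such a deprecation is a thin wrapper that warns and then delegates to the canonical function. A minimal plain-Python sketch of the pattern (math.tanh stands in for torch.tanh here; this is not PyTorch's actual code):

```python
import math
import warnings

def tanh(input):
    """Deprecated alias: warn, then delegate to the canonical implementation.

    Sketch of the deprecation pattern only; math.tanh stands in for
    torch.tanh, this is not PyTorch's actual code.
    """
    warnings.warn(
        "nn.functional.tanh is deprecated. Use torch.tanh instead.",
        DeprecationWarning,
        stacklevel=2,
    )
    return math.tanh(input)

# The deprecated call still works, but a warning is emitted.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = tanh(0.0)

print(result)  # 0.0
print(caught[0].category.__name__)  # DeprecationWarning
```

`stacklevel=2` makes the warning point at the caller's line rather than the wrapper, which is what users need to find the call site to migrate.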

@vishwakftw (Contributor) commented Jun 21, 2018

Other functions in torch.nn.functional that could be deprecated:

  1. selu
  2. rrelu

Let me know what you think @soumith .

EDIT: deprecating these functions requires only small changes, so they could be part of this PR too.

@weiyangfb (Contributor, Author) commented Jun 25, 2018

@vishwakftw Yes, I can deprecate selu and rrelu, but after that people could only use the modules nn.SELU and nn.RReLU. Should we do this?

cc @soumith

@soumith (Contributor) commented Jun 25, 2018

Wait, why would we deprecate relu and rrelu? We don't have torch.relu and torch.rrelu.

@weiyangfb (Contributor, Author)

@soumith That's right. We will keep them for now then.

@weiyangfb (Contributor, Author)

@pytorchbot retest this please

@vishwakftw (Contributor) commented Jun 25, 2018

I thought we could deprecate torch.nn.functional.rrelu and torch.nn.functional.selu; both are available as torch.rrelu and torch.selu (and their in-place versions), respectively.

This is how torch.nn.functional.selu works:

def selu(input, inplace=False):
    r"""selu(input, inplace=False) -> Tensor

    Applies element-wise,
    :math:`\text{SELU}(x) = scale * (\max(0,x) + \min(0, \alpha * (\exp(x) - 1)))`,
    with :math:`\alpha=1.6732632423543772848170429916717` and
    :math:`scale=1.0507009873554804934193349852946`.

    See :class:`~torch.nn.SELU` for more details.
    """
    if inplace:
        return torch.selu_(input)
    return torch.selu(input)
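The docstring formula can be sanity-checked with a plain-Python reference (a sketch using only the standard library; the alpha and scale constants are copied from the docstring above, and selu_ref is a hypothetical helper, not a PyTorch function):

```python
import math

# Constants copied from the selu docstring above.
ALPHA = 1.6732632423543772848170429916717
SCALE = 1.0507009873554804934193349852946

def selu_ref(x):
    """Reference SELU: scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))."""
    return SCALE * (max(0.0, x) + min(0.0, ALPHA * (math.exp(x) - 1.0)))

print(selu_ref(0.0))          # 0.0
print(selu_ref(1.0) == SCALE)  # True: positive inputs hit only the max() branch
```

For x > 0 the min() term is zero (exp(x) - 1 > 0), so SELU reduces to scale * x; for x < 0 the max() term is zero and the scaled exponential branch takes over.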

@weiyangfb (Contributor, Author)

@vishwakftw Ah, I see! So they are just not documented yet; I will add them to the docs. That way, functional.selu and functional.rrelu can be safely deprecated.

>>> input = torch.randn(2)
>>> torch.selu(input)
tensor([1.4261, 0.2250])
>>> input = torch.randn(2)
>>> torch.rrelu(input)
tensor([-0.0398, -0.1174])

@weiyangfb weiyangfb changed the title Deprecated torch.nn.functional.tanh() Deprecated several functions at torch.nn.functional Jun 26, 2018
@fmassa (Member) commented Jun 26, 2018

So the goal now is to expose everything in the torch namespace?

@weiyangfb weiyangfb force-pushed the deprecate_nn_functional_tanh branch from 8a4bf4f to 1d19ab6 Compare June 26, 2018 17:54
@weiyangfb (Contributor, Author)
After offline discussion, we decided to move tanh and sigmoid from nn.functional to torch, since they are more like common scientific functions, and to keep the others in nn.functional for now.
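(As an aside on the "common scientific functions" point: tanh and the logistic sigmoid are the same curve up to scaling and shifting, via the identity tanh(x) = 2 * sigmoid(2x) - 1. A quick plain-Python check, where sigmoid_ref is a hypothetical helper written for this sketch, not a PyTorch function:)

```python
import math

def sigmoid_ref(x):
    """Logistic sigmoid 1 / (1 + exp(-x)); a hypothetical helper, not torch.sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

# tanh(x) == 2 * sigmoid(2x) - 1 for all real x.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(math.tanh(x) - (2.0 * sigmoid_ref(2.0 * x) - 1.0)) < 1e-12
print("identity holds")
```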

@weiyangfb weiyangfb force-pushed the deprecate_nn_functional_tanh branch from 1d19ab6 to 4512e83 Compare June 26, 2018 18:20
@weiyangfb (Contributor, Author)
The failed Caffe2 CI tests look unrelated. Can I get a stamp on this?

@weiyangfb (Contributor, Author)
@pytorchbot retest this please

@facebook-github-bot left a comment

@weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@vadimkantorov (Contributor) commented Jun 30, 2018

relu is also already available in the torch namespace: both torch.relu(torch.Tensor([1])) and torch.Tensor([1]).relu() work.

The same goes for softmax.

@soumith (Contributor) commented Jul 1, 2018

@vadimkantorov We will probably remove that; please don't rely on undocumented features. The nn namespace is really the place for neural-net-specific functions.

@vadimkantorov (Contributor) commented Jul 1, 2018

@soumith OK :) I wasn't sure whether it was a documentation gap or an accident of the auto bindings :)

Then it's a consistency question: softmax and sigmoid are often used in similar contexts.

@soumith (Contributor) commented Jul 1, 2018

the latter :)

@facebook-github-bot left a comment

@weiyangfb is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@weiyangfb weiyangfb force-pushed the deprecate_nn_functional_tanh branch from 4512e83 to ce1d853 Compare July 2, 2018 19:59
@facebook-github-bot left a comment

@weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@weiyangfb weiyangfb deleted the deprecate_nn_functional_tanh branch July 2, 2018 23:09
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
Summary:
1. fixes pytorch#6245
2. deprecated tanh, sigmoid
Closes pytorch#8748

Differential Revision: D8697975

Pulled By: weiyangfb

fbshipit-source-id: f30714aa0611a1fe870040692f3dbcc8238aece9

Development

Successfully merging this pull request may close these issues.

Deprecate torch.nn.functional.tanh?

6 participants