Conversation

@weiyangfb (Contributor) commented Sep 18, 2018

@gchanan @apaszke

@weiyangfb force-pushed the create_tensor_requires_grad_false branch from d92b7dc to 9e9fdf3 on September 18, 2018 at 19:43
if (r.idx == 0) {
  PyObject* data = r.pyobject(0);
  if (THPVariable_Check(data)) {
    PyErr_WarnEx(PyExc_UserWarning,
        "To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() "
        "or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).", 1);
  }
}

a[0] = 7.
self.assertEqual(5., res1[0].item())
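
For context, a hedged reconstruction of the test fragment above; the setup lines and the `res1` assignment are assumptions inferred from the asserted values:

```python
import torch

a = torch.tensor([5.])
res1 = torch.tensor(a)       # the constructor copies a's storage
a[0] = 7.                    # mutating the source afterwards...
assert res1[0].item() == 5.  # ...does not affect the copy
```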

def test_tensor_factory_copy_var(self):

@facebook-github-bot (Contributor) commented:

weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

# test torch.as_tensor()
source = source.add(1)  # make source non-leaf
# without a dtype change, as_tensor preserves the source's autograd state
check_copy(torch.as_tensor(source), source.is_leaf, source.requires_grad)
# a dtype change forces a copy into a fresh leaf that does not require grad
check_copy(torch.as_tensor(source, dtype=torch.float), True, False)

// diff fragments: requires_grad is no longer set on the chained result of
// internal_new_from_data(...); the constructor now detaches the new tensor first:
auto new_tensor = internal_new_from_data(..., data);
new_tensor.detach_(); // making new_tensor a leaf node
new_tensor.set_requires_grad(r.toBool(3));

PyObject* data = r.pyobject(0);
if (THPVariable_Check(data)) {
  PyErr_WarnEx(PyExc_UserWarning,
      "To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() "
      "or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).", 1);
}

// diff fragments: `args_requires_grad` is removed from internal_new_from_data(...),
// whose argument list now ends with `type_inference`; the constructor detaches the
// result and sets requires_grad explicitly:
auto new_tensor = internal_new_from_data(..., type_inference);
new_tensor.detach_(); // making new_tensor a leaf node
new_tensor.set_requires_grad(r.toBool(3));

ParsedArgs<3> parsed_args;
auto r = parser.parse(args, kwargs, parsed_args);
if (r.idx == 0) {
  at::optional<Device> device_opt = r.deviceOptional(2);

@apaszke (Contributor) left a comment:

Semantics of torch.as_tensor need to be clarified.

def check_copy(copy, copy_is_leaf, copy_requires_grad):
    self.assertEqual(copy.data, source.data)
    self.assertTrue(copy.is_leaf == copy_is_leaf)
    self.assertTrue(copy.requires_grad == copy_requires_grad)
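
A hedged sketch of how this helper exercises the new constructor semantics; in the test, `source` comes from the enclosing scope, so the setup below is an assumption:

```python
import torch

source = torch.ones(5, requires_grad=True)

def check_copy(copy, copy_is_leaf, copy_requires_grad):
    assert torch.equal(copy.detach(), source.detach())
    assert copy.is_leaf == copy_is_leaf
    assert copy.requires_grad == copy_requires_grad

# After this PR, torch.tensor(source) yields a detached leaf that does not
# require grad (and emits the clone().detach() recommendation warning):
check_copy(torch.tensor(source), True, False)
```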

:func:`as_tensor()` reads out 'the data' from whatever it is passed,
and constructs a leaf variable. Therefore ``torch.as_tensor(x, dtype=dtype, device=device)``
is equivalent to ``x.clone().detach().to(dtype, device)``.
The equivalent spellings using ``clone()`` and ``detach()`` are recommended.
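
A minimal sketch of that equivalence (the tensors here are illustrative):

```python
import torch

x = torch.randn(3, requires_grad=True)

# Per the note above, a dtype change copies the data into a fresh leaf:
y = torch.as_tensor(x, dtype=torch.float64)
assert y.is_leaf and not y.requires_grad

# The recommended explicit spelling:
z = x.clone().detach().to(torch.float64)
assert z.is_leaf and not z.requires_grad
```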

// fragments around the internal_new_from_data call in the parser path:
//   ..., typeWithDefault(r, 1, 2, type), r.deviceOptional(2), r.pyobject(0), false, false, type_inference);
bool is_copy = false;
if (THPVariable_Check(data)) {

@ssnl (Collaborator) commented Sep 20, 2018

Currently torch.sparse_coo_tensor(i, v, torch.Size(size)) warns

test_autograd.py:576: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  x = torch.sparse_coo_tensor(i, v, size)

That warning doesn't really make sense here. Does this patch remove it?
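
A minimal sketch that reproduces the spurious warning (index and value shapes are illustrative, not from the original test):

```python
import torch

i = torch.tensor([[0, 1], [1, 0]])  # 2 x nnz indices
v = torch.tensor([3.0, 4.0])        # nnz values
size = torch.Size([2, 2])

# Before this patch, this line triggered the copy-construct UserWarning,
# even though no tensor is being copy-constructed by the caller:
x = torch.sparse_coo_tensor(i, v, size)
```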

@weiyangfb (Contributor, Author) commented Sep 20, 2018

@ssnl Yes, it warns because torch.sparse_coo_tensor calls into internal_new_from_data, where I previously put the warning. I removed it in this PR, so the warning should be gone after this lands.
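
A hedged sketch of the intended post-patch behavior (the warning-capture scaffolding is illustrative):

```python
import torch
import warnings

src = torch.randn(2)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    torch.tensor(src)  # copy construction: the recommendation warning still fires here
    torch.sparse_coo_tensor(torch.tensor([[0], [0]]),  # sparse construction:
                            torch.tensor([1.0]),       # no copy-construct warning
                            torch.Size([1, 1]))        # after this PR

# Expect the only copy-construct warning to come from torch.tensor(src):
assert any("copy construct" in str(w.message) for w in caught)
```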

@weiyangfb force-pushed the create_tensor_requires_grad_false branch from a211893 to 36d298c on September 20, 2018 at 22:23
@facebook-github-bot (Contributor) commented:

weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@soumith changed the title from "fix PR #11061" to "detach returned tensors from source tensors in tensor constructors" on Sep 21, 2018
iotamudelta pushed a commit to ROCm/pytorch that referenced this pull request Sep 21, 2018
Summary:
- fix PR pytorch#11061 by moving `detach_()` and `set_requires_grad()` into `torch.tensor_ctor()` and `tensor.new_tensor`, and removing the warnings and the `args_requires_grad` parameter from `internal_new_from_data`
- with this patch, the tensor returned from `tensor_ctor()` and `new_tensor` is detached from the source tensor, and its requires_grad is set from the input args (see the sketch below)
- `torch.as_tensor` retains its behavior as documented

gchanan apaszke
Pull Request resolved: pytorch#11815

Differential Revision: D9932713

Pulled By: weiyangfb

fbshipit-source-id: 4290cbc57bd449954faadc597c24169a7b2d8259
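
A hedged sketch of the behavior described in the summary above (variable names are illustrative, not from the PR's tests):

```python
import torch

src = torch.ones(3, requires_grad=True) * 2   # non-leaf source tensor

# torch.tensor(src) copies the data and detaches the result from src's graph;
# it may emit the clone().detach() recommendation warning:
t = torch.tensor(src)
assert t.is_leaf and not t.requires_grad

# requires_grad is taken from the constructor argument, not from the source:
t2 = torch.tensor(src, requires_grad=True)
assert t2.is_leaf and t2.requires_grad

# new_tensor behaves the same way:
t3 = src.new_tensor([1.0, 2.0, 3.0])
assert t3.is_leaf and not t3.requires_grad
```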
@ezyang added the merged label on Jun 26, 2019