log_softmax + torch.diag backward() error #4299

@zou3519

Description

Reported here: https://discuss.pytorch.org/t/torch-diag-of-a-non-square-matrix-backward-error/11426

MWE (tested on master):

import torch

# log_softmax over dim 1 of a non-square (4 x 3) matrix
aa = torch.nn.functional.log_softmax(
    torch.autograd.Variable(torch.randn(4, 3), requires_grad=True), 1)
b = torch.diag(aa, 1)  # offset diagonal of the non-square matrix
c = torch.sum(b)
c.backward()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/rzou/pytorch/torch/autograd/variable.py", line 110, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/home/rzou/pytorch/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
RuntimeError: output and gradOutput shapes do not match: output [4 x 3], gradOutput [3 x 3] at /home/rzou/pytorch/aten/src/THNN/generic/
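The shapes in the error hint at the cause: diag's backward appears to re-apply diag(grad, offset) to the 1-D gradient, which for a length-2 diagonal with offset 1 yields a square [3 x 3] matrix; log_softmax's backward then rejects it because its forward output was [4 x 3]. As a stopgap until the derivative is fixed, the offset diagonal can be gathered by explicit advanced indexing, whose backward preserves the input shape. A minimal sketch, assuming the current tensor API (the diag_offset helper is hypothetical, not part of torch):

import torch

def diag_offset(x, offset=0):
    # Hypothetical workaround: gather the offset diagonal via advanced
    # indexing, which autograd differentiates with the correct input shape.
    rows, cols = x.shape
    if offset >= 0:
        n = min(rows, cols - offset)
        idx = torch.arange(n)
        return x[idx, idx + offset]
    n = min(rows + offset, cols)
    idx = torch.arange(n)
    return x[idx - offset, idx]

x = torch.randn(4, 3, requires_grad=True)
aa = torch.nn.functional.log_softmax(x, 1)
b = diag_offset(aa, 1)  # same values as torch.diag(aa, 1)
c = torch.sum(b)
c.backward()            # x.grad now has the expected [4 x 3] shape

With indexing, each diagonal element's gradient scatters back into a tensor of the original [4 x 3] shape; the real fix is for diag's backward to allocate a gradient matching the original, possibly non-square, input.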
