Reported here: https://discuss.pytorch.org/t/torch-diag-of-a-non-square-matrix-backward-error/11426
MWE (tested on master):
import torch
aa = torch.nn.functional.log_softmax(torch.autograd.Variable(torch.randn(4, 3), requires_grad=True), 1)
b = torch.diag(aa, 1)
c = torch.sum(b)
c.backward()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/rzou/pytorch/torch/autograd/variable.py", line 110, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
File "/home/rzou/pytorch/torch/autograd/__init__.py", line 98, in backward
variables, grad_variables, retain_graph)
RuntimeError: output and gradOutput shapes do not match: output [4 x 3], gradOutput [3 x 3] at /home/rzou/pytorch/aten/src/THNN/generic/
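The mismatch suggests diag's backward is building the gradient for a square matrix (the [3 x 3] gradOutput) instead of the non-square input shape [4 x 3]. Until that is fixed, a possible workaround (not from the original report, and using current-style PyTorch APIs rather than Variable) is to extract the k-th diagonal by explicit advanced indexing, whose backward already produces a gradient matching the input shape:

```python
import torch

# Hypothetical workaround sketch: replicate torch.diag(aa, 1) with
# advanced indexing so autograd produces a [4 x 3] gradient.
x = torch.randn(4, 3, requires_grad=True)
aa = torch.log_softmax(x, dim=1)

k = 1  # same diagonal offset as in the MWE
n = min(aa.size(0), aa.size(1) - k)  # length of the k-th diagonal
rows = torch.arange(n)
b = aa[rows, rows + k]  # forward-equivalent to torch.diag(aa, 1)

c = b.sum()
c.backward()  # x.grad now has the input shape [4 x 3]
```

This only sidesteps the bug for the forward-equivalent computation; the underlying fix would still need to land in diag's backward.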