Closed
Issue description
torch.einsum currently modifies its input variables in-place (without any explicit indication that it does so, such as the usual trailing-underscore naming convention), which prevents PyTorch from automatically backpropagating through it.
Code example
```python
import torch

x = torch.randn(3, 3, requires_grad=True)
z1 = torch.einsum("ij,jk->ik", (x, torch.randn(3, 3)))
z2 = torch.einsum("ij,jk->ik", (x, torch.randn(3, 3)))
z1.sum().backward()
```
Results in the following runtime error:
```
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
```
Demonstration and discussion can also be found here.
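Until this is fixed, one possible workaround (a sketch, not part of the original report; the tensor names `b1`/`b2` are made up for illustration) is to express the same `"ij,jk->ik"` contraction with `torch.matmul`, which performs the identical matrix product without einsum's in-place behavior:

```python
import torch

x = torch.randn(3, 3, requires_grad=True)
b1 = torch.randn(3, 3)
b2 = torch.randn(3, 3)

# Equivalent to torch.einsum("ij,jk->ik", (x, b1)) but avoids the
# in-place modification described in this issue.
z1 = torch.matmul(x, b1)
z2 = torch.matmul(x, b2)

z1.sum().backward()

# Sanity check: d(sum(x @ b1))/dx = ones(3, 3) @ b1.T
assert torch.allclose(x.grad, torch.ones(3, 3) @ b1.t())
```

Running both `matmul` calls and then calling `backward()` succeeds here because neither call mutates `x` or any saved intermediate.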
- PyTorch or Caffe2: PyTorch
- How you installed PyTorch (conda, pip, source): pip
- OS:
- PyTorch version: 0.4.0
- Python version: 3.6.1