Conversation

zou3519 (Contributor) commented Jan 8, 2018

Fixes #4299

Previously, the gradient of a non-square matrix input to torch.diag would be a square matrix rather than matching the input's shape. This fixes that.
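For context, here is a minimal Python sketch (a hypothetical helper, not the PR's actual derivative code) of what the corrected backward must do: scatter grad_output onto the diagonal of a zero tensor with the input's shape, rather than a square one.

import torch

# Sketch only: place the grad_output entries on the diagonal of a gradient
# tensor that has the same (possibly non-square) shape as the input.
def diag_backward_sketch(grad_output, input_size):
    grad_input = torch.zeros(input_size)
    n = min(input_size)
    idx = torch.arange(n)
    grad_input[idx, idx] = grad_output
    return grad_input

# For a (3, 5) input, grad_output has 3 entries and the gradient is (3, 5).
print(diag_backward_sketch(torch.ones(3), (3, 5)).size())  # torch.Size([3, 5])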

Test Plan

New unit test in test_autograd. Also run the following and verify that x.grad has the same size as x:

import torch
from torch.autograd import Variable

x = Variable(torch.randn(3, 5), requires_grad=True)
x.diag().sum().backward()
x.grad.size()  # should be torch.Size([3, 5])
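For a numerical check, a gradcheck along these lines exercises the non-square case (a sketch; the actual new test in test_autograd may be structured differently):

import torch
from torch.autograd import Variable, gradcheck

x = Variable(torch.randn(3, 5).double(), requires_grad=True)
assert gradcheck(torch.diag, (x,), eps=1e-6, atol=1e-4)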

ssnl (Collaborator) left a comment

LGTM

soumith merged commit e4eaf67 into pytorch:master on Feb 5, 2018

Development

Successfully merging this pull request may close this issue: log_softmax + torch.diag backward() error

5 participants