
Conversation

sccbhxc commented Apr 13, 2018

No description provided.

zou3519 (Contributor) left a comment


Thanks for the catch! It's intentional that we use ctx.saved_tensors here rather than ctx.saved_variables.

```python
@staticmethod
def backward(ctx, grad_output):
    # This is a pattern that is very convenient - at the top of backward
    # unpack saved_tensors and initialize all gradients w.r.t. inputs to
    # None. Thanks to the fact that additional trailing Nones are
    # ignored, the return statement is simple even when the function has
    # optional inputs.
    input, weight, bias = ctx.saved_tensors
```
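For readers without the surrounding docs page open: the snippet above is from the `LinearFunction` example in the extending-autograd tutorial. A minimal sketch of the full pattern it belongs to, reconstructed from that example (the exact docs version may differ slightly), looks like this:

```python
import torch
from torch.autograd import Function

class LinearFunction(Function):
    @staticmethod
    def forward(ctx, input, weight, bias=None):
        # Stash whatever backward will need; it comes back as ctx.saved_tensors.
        ctx.save_for_backward(input, weight, bias)
        output = input.mm(weight.t())
        if bias is not None:
            output += bias.unsqueeze(0).expand_as(output)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias = ctx.saved_tensors
        # Initialize every gradient to None so optional or non-differentiable
        # inputs can simply be skipped below.
        grad_input = grad_weight = grad_bias = None
        if ctx.needs_input_grad[0]:
            grad_input = grad_output.mm(weight)
        if ctx.needs_input_grad[1]:
            grad_weight = grad_output.t().mm(input)
        if bias is not None and ctx.needs_input_grad[2]:
            grad_bias = grad_output.sum(0)
        # Trailing Nones (here, grad_bias when bias is None) are ignored,
        # so a single return statement covers the optional-input case too.
        return grad_input, grad_weight, grad_bias
```

Calling it goes through `LinearFunction.apply(input, weight, bias)`; autograd then routes gradients through the `backward` above.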

ssnl (Collaborator) commented May 1, 2018

@sccbhxc Thanks for your PR. This is good to merge once the minor nit is fixed.

zou3519 (Contributor) commented May 21, 2018

#7725 fixes this, so I'm closing this PR.
