[Release/1.7] aten::set_grad_enabled should not push as it does not return a value #46060
Cherry-pick of #45559 into release/1.7
Summary
Fixes #45558
This assertion failure is caused by an incorrect implementation of `aten::set_grad_enabled` in torch/csrc/jit/runtime/register_special_ops.cpp. The current implementation pushes a `None` onto the evaluation stack after calling `set_enabled`. According to the operator's signature, this is wrong: the signature says the function does not return a value. The original author may have been confused by the behavior of Python, which pushes a `None` onto the evaluation stack when a function definition does not end with a return statement carrying an explicit value.

Because `aten::set_grad_enabled` pushes a `None` onto the evaluation stack, each call leaves an extra `None` behind. In our case, `with torch.no_grad():` causes `aten::set_grad_enabled` to be called twice, so when the `forward` method finishes, the evaluation stack is `[None, None, Tensor]`. The return statement of `GraphFunction::operator()` in torch/csrc/jit/api/function_impl.cpp is `return stack.front();`, which tries to extract a tensor out of a `None` and thus triggers the assertion failure.

The fix is simple: remove the push from the implementation of `aten::set_grad_enabled`.

Pull Request resolved: #45559
Reviewed By: albanD
Differential Revision: D24142153
Pulled By: SplitInfinity
fbshipit-source-id: 75aad0e38bd912a437f7e1a1ee89ab4445e35b5d