Fix serialization for Parameters #8633
Conversation
torch/nn/parameter.py (outdated diff):

    def __reduce_ex__(self, proto):
        tensor = torch._utils._rebuild_tensor_v2(self.storage(), self.storage_offset(), tuple(self.size()),
                                                 self.stride(), self.requires_grad, self._backward_hooks)
        return (Parameter, (tensor, self.requires_grad))
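(Context, not part of the PR diff.) Pickling goes through the `__reduce_ex__` protocol: `pickle.dumps` asks the object for a `(callable, args)` pair and `pickle.loads` rebuilds it by calling `callable(*args)`, which is why returning `(Parameter, (tensor, self.requires_grad))` restores both the subclass and `requires_grad`. A minimal, self-contained sketch of that mechanism using a toy class (`Tagged` is made up for illustration, not PyTorch code):

```python
import pickle


class Tagged(float):
    """Toy subclass standing in for Parameter subclassing Tensor."""

    def __new__(cls, value, flag=True):
        obj = super().__new__(cls, value)
        obj.flag = flag
        return obj

    def __reduce_ex__(self, proto):
        # Rebuild by calling Tagged(float(self), self.flag) so the subclass
        # and the extra attribute survive the round trip.
        return (Tagged, (float(self), self.flag))


x = Tagged(1.5, flag=False)
y = pickle.loads(pickle.dumps(x))
assert type(y) is Tagged and y.flag is False
```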
    def __repr__(self):
        return 'Parameter containing:\n' + super(Parameter, self).__repr__()

    def __reduce_ex__(self, proto):
        import cPickle as pickle
    else:
        import pickle
    a = torch.nn.Parameter(torch.randn(5, 5))
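The hunk above is only an excerpt of the test under discussion, so the surrounding code is not visible here. A sketch of what such a regression test could look like (the Python 2/3 `pickle` import mirrors the excerpt; the function name and assertions are assumptions, not necessarily what the PR adds):

```python
import sys

if sys.version_info[0] == 2:
    import cPickle as pickle
else:
    import pickle

import torch


def test_parameter_pickle_roundtrip():
    a = torch.nn.Parameter(torch.randn(5, 5))
    b = pickle.loads(pickle.dumps(a))
    # A Parameter should come back as a Parameter, with its data and
    # requires_grad flag intact.
    assert isinstance(b, torch.nn.Parameter)
    assert b.requires_grad == a.requires_grad
    assert torch.equal(a.data, b.data)
```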
ssnl left a comment:
I think this would be good to merge if a new test is added.
ssnl left a comment:
LGTM! Thanks!
* upstream/master: (92 commits)
  more formatting (pytorch#8701)
  Fix pytorch#8692 (pytorch#8699)
  Create captured inputs recursively for loop to resolve loop-carried dependencies across nested blocks (pytorch#8345)
  Shard test_nn to reduce runtime for each test target (pytorch#8678)
  Create at::tensor (pytorch#8475)
  Clarify mp note about sharing a tensor's grad field. (pytorch#8688)
  Add owner rule for cpp_extension.py (pytorch#8700)
  fix formatting in :math: in fold docstring (pytorch#8696)
  Some 0-sized dimension support, port catArray away from resizeLegacy. (pytorch#8666)
  Implement flatten function (pytorch#8578)
  Created Tensor::to functions (pytorch#8643)
  Add a warning in gradcheck if inputs precision < float64 (pytorch#8663)
  Fix parsing of floating point defaults in python_arg_parser (pytorch#8681)
  Export ProcessGroupGloo options to Python (pytorch#8664)
  Fix build error in pybind_state_ideep (pytorch#8684)
  Compatibility: write nDimension/_nDimension corresponding to dim()/_dim(). (pytorch#8676)
  Improve win-build.sh for local build (pytorch#8674)
  don't do unnecessary copies for bernoulli_ (pytorch#8682)
  Use parallel if get_num_threads 0 (pytorch#8677)
  Fix serialization for Parameters (pytorch#8633)
  ...
Extending from pytorch#8633, this allows `requires_grad` to be restored when serializing with `keep_vars=True`. We could potentially rename `keep_vars`, as I am guessing it is a reference to the old `Variable`?
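A sketch of the behaviour described here, assuming the standard `nn.Module.state_dict(keep_vars=True)` API (the module choice and file name are arbitrary, for illustration only):

```python
import torch
import torch.nn as nn

m = nn.Linear(4, 2)
m.bias.requires_grad = False  # freeze one parameter

# keep_vars=True keeps the Parameter objects themselves rather than
# detached tensors, so serialization goes through Parameter pickling.
state = m.state_dict(keep_vars=True)
torch.save(state, "linear_state.pt")

restored = torch.load("linear_state.pt")
# With the follow-up change, requires_grad should survive the round trip.
assert restored["bias"].requires_grad is False
assert restored["weight"].requires_grad is True
```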
Parameter was inheriting Tensor's `__reduce_ex__()`, which caused pickling to not work properly. Fixes #8077.
cc: @zou3519
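The `(Parameter, (tensor, self.requires_grad))` pair works because it matches `Parameter`'s constructor signature (`data=None, requires_grad=True`), so unpickling reduces to an ordinary construction call. A small illustrative check (the assertions are mine, not copied from the PR's test):

```python
import torch
from torch.nn import Parameter

p = Parameter(torch.randn(3), requires_grad=False)

# Rebuilding with the (callable, args) pair from the diff is just an
# ordinary constructor call: Parameter(tensor, requires_grad).
rebuilt = Parameter(p.data, p.requires_grad)
assert isinstance(rebuilt, Parameter)
assert rebuilt.requires_grad is False
assert torch.equal(rebuilt.data, p.data)
```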