Updates serialization note to explain versioned symbols and dynamic versioning #41395
Conversation
💊 Dr. CI: As of commit 382b01a, CI looks good so far; no failures reported.
docs/source/notes/serialization.rst
Outdated
reconstruct the view relationships between the loaded tensors. In the above
snippet, for example, only a single storage is written to 'tensors.pt'.

In some cases, however, saving storages may be unecessary and create
nit: unnecessary, although the wording is a little strange (you have to save something), maybe say "saving the entire storage may be unnecessary/wasteful"?
Yes, that's better.
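For context, a minimal sketch of the behavior under discussion, assuming the `large`/`small` naming and the 1000-element storage used in the note's snippet: saving a small view still serializes the entire shared storage.

```python
import torch

large = torch.arange(1000)  # tensor backed by a 1000-element storage
small = large[:1]           # a 1-element view sharing large's storage

# torch.save serializes storages, so 'small.pt' contains all 1000 values
# from the shared storage, not just the single value small refers to.
torch.save(small, 'small.pt')

loaded = torch.load('small.pt')
print(loaded.size())            # torch.Size([1])
print(loaded.storage().size())  # 1000
```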
docs/source/notes/serialization.rst
Outdated
the 999 values in the storage shared with `large` were saved and loaded.

When saving tensors smaller than their storage(s), the size of the saved file
can be reduced by first cloning the tensors to be saved. Cloning a tensor
I don't know what "containing only its values" means. I'd probably frontload the description below, i.e. say something like: "Cloning works to reduce the serialized file size because a cloned tensor does not preserve view relationships..."
I agree with you that "containing only its value" is unclear. I'll improve the clarity.
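A hedged sketch of the cloning approach the comment refers to (same assumed `large`/`small` setup as above): the clone gets its own minimally sized storage, so only its values are written.

```python
import torch

large = torch.arange(1000)
small = large[:1]

# A cloned tensor gets its own, minimally sized storage, so only the
# cloned values are written to disk.
torch.save(small.clone(), 'small.pt')

loaded = torch.load('small.pt')
print(loaded.storage().size())  # 1, not 1000
```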
facebook-github-bot left a comment
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Since the cloned tensors are independent of each other, however, they have
none of the view relationships the original tensors did. If both file size and
view relationships are important when saving tensors smaller than their
nit: "construct new storage objects" sounds like something the user should be doing. Better to say something like "construct new tensors with the desired view relationships before saving, such that the storage object sizes are minimized."
Sure. Sounds good.
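A possible sketch of the "construct new tensors with the desired view relationships before saving" suggestion; the tensor names and sizes here are assumed for illustration, not taken from the note:

```python
import torch

large = torch.arange(1000)
a = large[:10]
b = large[5:15]  # a and b overlap and share large's 1000-element storage

# Copy just the region the views need into a fresh, minimally sized
# storage, then rebuild the views on top of it before saving.
base = large[:15].clone()
a_small = base[:10]
b_small = base[5:15]

torch.save((a_small, b_small), 'views.pt')

a_loaded, b_loaded = torch.load('views.pt')
print(a_loaded.storage().size())  # 15, not 1000
print(a_loaded.storage().data_ptr() == b_loaded.storage().data_ptr())  # True: view relationship preserved
```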
Doc update intended to clarify and expand our current serialization behavior, including explaining the differences between torch.save/torch.load, torch.nn.Module.state_dict/torch.nn.Module.load_state_dict, and torch.jit.save/torch.jit.load. Also explains, for the first time, when historic serialized TorchScript behavior is preserved, and our recommendation for preserving behavior (using the same PyTorch version to consume a model as was used to produce it).
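As a rough sketch of the APIs the description contrasts (the module and file names here are illustrative, not from the note):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# torch.save/torch.load on a state_dict: the usual way to checkpoint
# an eager-mode module.
torch.save(model.state_dict(), 'model_state.pt')
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load('model_state.pt'))

# torch.jit.save/torch.jit.load: serializes the scripted module itself.
# Behavior is most reliably preserved by loading the file with the same
# PyTorch version that produced it.
scripted = torch.jit.script(model)
torch.jit.save(scripted, 'model_scripted.pt')
loaded_scripted = torch.jit.load('model_scripted.pt')
```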