[JIT] Update docs for recently added features #45232
Conversation
**Summary**: This commit updates the TorchScript language reference to include documentation on recently added TorchScript enums. It also removes `torch.no_grad` from the list of known unsupported `torch` modules and classes because it is now supported. **Test Plan**: Continuous integration. [ghstack-poisoned]
```rst
* :class:`torch.nn.RNN`
* :class:`torch.nn.AdaptiveLogSoftmaxWithLoss`
* :class:`torch.autograd.Function`
* :class:`torch.autograd.no_grad`
```
Is there a good place to mention that record_function is supported now?
💊 CI failures summary and remediations (Dr. CI): as of commit 5b009a6, 💚 Looks good so far! There are no failures yet. 💚 This comment was automatically generated by Dr. CI and has been revised 22 times.
**eellison** left a comment
Looks good, a couple comments
```rst
.. warning::

    TorchScript enum support is experimental.
```
IMO we don't need to mark it as experimental; there may be bugs, but that's true of everything else too 🤷
lol
```rst
.. _TorchScript Enums:
.. _TorchScript Enums:
```
Is the triple repeat here intentional?
I copied it over from the class types section; I don't know how to write .rst lol
```rst
TorchScript enum support is experimental.

Python enums can be used in TorchScript without any extra annotation or code:
```
Personally I would cut down on this section a bit; it's pretty long for what is an easy and simple feature to use (enums). See how short the named tuple section is.
What do you think I should cut out? Most of it is code samples.
**eellison** left a comment
Meant to accept; these are just suggestions though. It looks good, so obviously feel free to keep whatever.
```rst
::

    from enum import Enum

    class IntEnum(Enum):
        FOO = 1
        BAR = 2

    class FloatEnum(Enum):
        FOO = 1.2
        BAR = 2.3

    class StringEnum(Enum):
        FOO = "foo as in foo bar"
        BAR = "bar as in foo bar"
```
Would cut out this part
```python
from enum import Enum

class Color(Enum):
    RED = 1
    GREEN = 2

@torch.jit.script
def return_enum(cond: bool):
    if cond:
        return Color.RED
    else:
        return Color.GREEN

self.assertEqual(return_enum(True), Color.RED)
self.assertEqual(return_enum(False), Color.GREEN)
```
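For reference, the enum logic that the quoted snippet scripts can be checked in plain Python; `torch.jit.script` compiles this same conditional return of enum members, so a torch-free sketch of the behavior (not the PR's actual doc example) looks like:

```python
from enum import Enum

class Color(Enum):
    RED = 1
    GREEN = 2

# Plain-Python equivalent of the scripted function in the diff above;
# TorchScript compiles the same branch-and-return of enum members.
def return_enum(cond: bool) -> Color:
    if cond:
        return Color.RED
    else:
        return Color.GREEN

print(return_enum(True))   # Color.RED
print(return_enum(False))  # Color.GREEN
```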
And this part
```python
if cond:
    return Color.RED
else:
    return Color.GREEN
```
And then maybe include this example where you access the field name of `Color.RED`, or make it part of the other function.
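A sketch of that suggestion in plain Python: enum members expose `name` and `value` attributes, which TorchScript also supports accessing, so the scripted doc example would look the same with `@torch.jit.script` applied (this is an illustrative sketch, not the text that landed in the PR):

```python
from enum import Enum

class Color(Enum):
    RED = 1
    GREEN = 2

# Accessing the `name` and `value` fields of an enum member, as the
# review suggests demonstrating in the enum docs.
def enum_fields(color: Color):
    return color.name, color.value

print(enum_fields(Color.RED))    # ('RED', 1)
print(enum_fields(Color.GREEN))  # ('GREEN', 2)
```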
Codecov Report

```
@@               Coverage Diff                @@
##    gh/splitinfinity/49/base   #45232   +/- ##
================================================
- Coverage             68.05%   68.05%   -0.01%
================================================
  Files                   396      396
  Lines                 51235    51235
================================================
- Hits                  34868    34867        -1
- Misses               16367    16368        +1
```

Continue to review full report at Codecov.
@SplitInfinity merged this pull request in 4af4b71.
Stack from ghstack:
Summary
This commit updates the TorchScript language reference to include documentation on recently added TorchScript enums. It also removes `torch.no_grad` from the list of known unsupported `torch` modules and classes because it is now supported.
Test Plan
Continuous integration.
Differential Revision: D23971884