Allow ReflectionPad to accept 0-dim batch sizes. #39231
Conversation
💊 CI failures summary and remediations

As of commit c2020a3 (more details on the Dr. CI page):

🕵️ 7 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages:
CUDA test failures are real. Do you know what the issue is?

@mruberry problems have been resolved. Could you please review?
```python
def test_ReflectionPad_empty(self, device):
    for mod, inp in [
            (torch.nn.ReflectionPad1d(2), torch.randn(0, 3, 10, device=device)),
            (torch.nn.ReflectionPad2d(2), torch.randn(0, 3, 10, 10, device=device))]:
        self._test_module_empty_input(mod, inp)
```
I don't think you're covering all the cases. Don't you need to check ReflectionPad1d with a 2D and 3D input with a zero batch size, plus ReflectionPad2d with a 3D and 4D input with a zero batch size?
You should also add test cases that fail, where a non-batch dimension is zero and you assert you hit the appropriate error (self.assertRaisesRegex(...)), and a check for backward when the batch dim is zero.
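The assertRaisesRegex pattern the review asks for can be sketched without depending on torch. This is a hypothetical stand-in (`reflection_pad1d_check` is not a real torch function; it only mimics the shape validation being discussed): a zero batch dim on a 3-D input passes, but any other zero-sized dimension raises.

```python
import unittest

def reflection_pad1d_check(shape):
    # Hypothetical stand-in for ReflectionPad1d's shape validation,
    # not the real torch code: batch size 0 is allowed, any other
    # zero-sized dimension is rejected.
    if len(shape) != 3:
        raise RuntimeError("Expected 3D (batch mode) tensor for input")
    if 0 in shape[1:]:
        raise RuntimeError("Expected tensor with possibly 0 batch size "
                           "and other non-zero dimensions for input")
    return shape

class TestReflectionPadEmpty(unittest.TestCase):
    def test_zero_batch_dim_ok(self):
        # the new behavior: batch size 0 passes validation
        self.assertEqual(reflection_pad1d_check((0, 3, 10)), (0, 3, 10))

    def test_zero_non_batch_dim_raises(self):
        # the failing case the review asks for, via assertRaisesRegex
        with self.assertRaisesRegex(RuntimeError, "non-zero dimensions"):
            reflection_pad1d_check((3, 0, 10))

if __name__ == "__main__":
    unittest.main(exit=False, argv=["prog"])
```

In the real test suite the same shape would go through the actual module and the regex would match the error message raised by the kernel's input checks.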
Ah yes, good point. Will add them, thank you.
The checks for 2D/3D input are done in the Python code in functional.py, and ReflectionPad1d does not get called unless the input is specifically dim=3. The same goes for ReflectionPad2d (with dim=4). Therefore such tests are not needed.
However, I have added tests for the non-batch dimension being zero. The backward pass is tested in _test_module_empty_input.
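The dispatch the comment describes can be sketched as follows. This is a hedged sketch, not the actual functional.py source (`pad_dispatch` is a made-up name): with reflect padding, a length-2 pad tuple only reaches the 1D kernel for 3-D input and a length-4 pad only reaches the 2D kernel for 4-D input, so the kernels never see other ranks.

```python
def pad_dispatch(input_ndim, pad):
    # Hedged sketch of the routing described above (not real torch code):
    # the Python layer checks input rank before calling a padding kernel.
    if len(pad) == 2 and input_ndim == 3:
        return "reflection_pad1d"
    if len(pad) == 4 and input_ndim == 4:
        return "reflection_pad2d"
    raise NotImplementedError("unsupported input dim for reflect padding")

print(pad_dispatch(3, (2, 2)))        # reflection_pad1d
print(pad_dispatch(4, (1, 1, 2, 2)))  # reflection_pad2d
```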
Looks pretty good and clean, but you need to add more test cases to validate all the new behavior.

@mruberry can you please review this PR again?
mruberry
left a comment
Thanks for the follow-up, v0dro! I added a decorator to the test that should resolve the XLA build failure. If the tests pass this should be good to go!
facebook-github-bot
left a comment
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
This was successfully merged. Thanks v0dro!
Allows ReflectionPad1d and ReflectionPad2d to accept 0-dim batch sizes.
Related to issues:
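The merged behavior can be illustrated with a simplified, hypothetical shape helper (`reflection_pad_out_shape` is not a torch API): a 0-sized batch dim passes through unchanged, while each padded dim must be larger than its padding, so a zero-sized padded dim is rejected.

```python
def reflection_pad_out_shape(shape, pad):
    # Hypothetical helper, not real torch code. pad holds (lo, hi)
    # pairs for the trailing len(pad)//2 dims; leading dims (including
    # a batch size of 0) pass through untouched.
    ndims = len(pad) // 2
    lead, tail = shape[:-ndims], shape[-ndims:]
    out = []
    for size, lo, hi in zip(tail, pad[0::2], pad[1::2]):
        # reflection padding must be smaller than the dim it pads,
        # so a zero-sized padded dim always fails this check
        if lo >= size or hi >= size:
            raise RuntimeError("padding size should be less than input dim")
        out.append(size + lo + hi)
    return lead + tuple(out)

print(reflection_pad_out_shape((0, 3, 10), (2, 2)))            # (0, 3, 14)
print(reflection_pad_out_shape((0, 3, 10, 10), (2, 2, 2, 2)))  # (0, 3, 14, 14)
```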