[fix] torch.repeat : dim-0 backward #45212
Conversation
@albanD Please review :)
💊 CI failures summary (Dr. CI): As of commit e55e126, everything looks good so far; there are no failures yet. (This comment was generated automatically by Dr. CI.)
@pytorchbot rebase this please
```diff
  }

- Tensor repeat_backward(Tensor grad, int64_t input_dims, IntArrayRef repeats) {
+ Tensor repeat_backward(Tensor grad, int64_t input_dims, IntArrayRef repeats, IntArrayRef input_shape) {
```
nit: you can remove the input_dims argument now that you have all the shapes.
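For intuition, here is a minimal Python sketch of why `repeat_backward` needs the input shape (an editorial illustration with an assumed helper name, not the PR's actual ATen C++ code): when any repeat factor is zero, the forward output, and hence `grad`, is empty, so the input's size along that dimension cannot be recovered from `grad` and `repeats` alone.

```python
# Minimal sketch, assuming only torch; the function name and structure
# are illustrative, not the PR's C++ implementation.
import torch

def repeat_backward_sketch(grad, repeats, input_shape):
    """Gradient of x.repeat(*repeats) w.r.t. x, given grad of the output."""
    if 0 in repeats:
        # A zero repeat makes the forward output empty: grad carries no
        # information, and the input gradient is simply all zeros. This
        # is the case that requires knowing input_shape explicitly.
        return torch.zeros(input_shape, dtype=grad.dtype, device=grad.device)
    extra = len(repeats) - len(input_shape)      # implicit leading 1-dims
    padded = (1,) * extra + tuple(input_shape)
    # View grad as (r0, s0, r1, s1, ...) and sum out every repeat axis.
    view = [d for r, s in zip(repeats, padded) for d in (r, s)]
    g = grad.reshape(view).sum(dim=tuple(range(0, 2 * len(repeats), 2)))
    return g.reshape(input_shape)
```

For example, with `x` of shape `(2, 3)` and `repeats=(4, 5)`, `grad` has shape `(8, 15)` and is viewed as `(4, 2, 5, 3)` before summing over dims `(0, 2)`.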
test/test_autograd.py (Outdated)
```diff
  c.backward()
  self.assertEqual(b.grad, torch.tensor([-inf, 0., 0.]))

+ def test_repeat_zero_dim(self):
```
You don't have to write a custom test here. You can simply add a new entry that specifies the input size and the arguments. Something like `('repeat', (2,), (0,), 'zero_repeat')` will do the same thing.
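For reference, the suggested declarative entry exercises the same behaviour as this hand-written check (a sketch; the tensor values and the use of `sum()` as the reduction are arbitrary choices):

```python
import torch

# Repeating with a zero factor produces an empty output; the backward
# through it is the case this PR fixes (reported in #45201).
x = torch.randn(2, requires_grad=True)
y = x.repeat(0)              # shape (0,): empty result
y.sum().backward()           # sum over an empty tensor is 0
assert torch.equal(x.grad, torch.zeros_like(x))
```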
Thanks for sending a PR!
Codecov Report

```
@@            Coverage Diff             @@
##           master   #45212      +/-   ##
==========================================
- Coverage   68.01%   68.01%   -0.01%
==========================================
  Files         393      393
  Lines       50847    50847
==========================================
- Hits        34583    34582       -1
- Misses      16264    16265       +1
```

Continue to review the full report at Codecov.
@albanD I've addressed the comments. PTAL :)
albanD left a comment:
Perfect! Thanks for the update!
facebook-github-bot left a comment:
@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fixes #45201