Fix repeat non owning #4084
Conversation
These tests exercise tensor.repeat when using numpy-shared storage. The new test adds some additional cases and compares against np.tile.
Clone the input tensor instead of resizing it in place.
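For illustration, a minimal sketch of that kind of check (not the PR's actual test; shapes and repeat counts here are arbitrary): a tensor created with torch.from_numpy does not own its storage, and its repeat result should match np.tile while the input itself is left untouched.

```python
import numpy as np
import torch

arr = np.arange(6, dtype=np.float32).reshape(2, 3)
t = torch.from_numpy(arr)          # shares storage with arr, does not own it

result = t.repeat(2, 3)            # must not resize `t` in place

expected = np.tile(arr, (2, 3))
assert result.numpy().tolist() == expected.tolist()
assert tuple(t.size()) == (2, 3)   # the input keeps its original shape
```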
raise ValueError('Number of dimensions of repeat dims can not be '
                 'smaller than number of dimensions of tensor')

xtensor = src.new().set_(src)
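The src.new().set_(src) idiom creates a fresh tensor header that shares src's storage, sizes, and strides, so later shape changes are applied to this alias rather than to the (possibly non-owning) input. A minimal sketch of that behaviour, using the same legacy new()/set_ API:

```python
import numpy as np
import torch

src = torch.from_numpy(np.zeros((2, 3), dtype=np.float32))

# New tensor header on the same storage: no data is copied.
xtensor = src.new().set_(src)
assert xtensor.data_ptr() == src.data_ptr()

# Reshaping the alias leaves the original tensor's metadata untouched.
xtensor = xtensor.view(1, 2, 3)
assert tuple(src.size()) == (2, 3)
```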
Updated to use …
torch/tensor.py
  size = torch.Size([a * b for a, b in zip(xsize, repeats)])
- xtensor.resize_(torch.Size(xsize))
+ xtensor = xtensor.view(torch.Size(xsize))
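The size computation pads the source shape with leading 1s until it has as many entries as repeats, then multiplies the two elementwise to get the output shape; view then reshapes the storage-sharing alias without mutating anything in place. A small worked example of that arithmetic (illustrative values; the leading-1 padding happens earlier in the function and is assumed here):

```python
import torch

repeats = [4, 2, 3]
xsize = [2, 3]                     # shape of the source tensor

# Pad the source shape with leading 1s so it matches len(repeats).
xsize = [1] * (len(repeats) - len(xsize)) + xsize       # [1, 2, 3]

size = torch.Size([a * b for a, b in zip(xsize, repeats)])
assert size == torch.Size([4, 4, 9])
```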
I added some explanatory comments to the function itself and added a test that repeats a non-contiguous input.
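For reference, a sketch of what a non-contiguous case can look like (not the PR's actual test): a transposed view is non-contiguous, and its repeat result should still agree with np.tile.

```python
import numpy as np
import torch

base = torch.arange(6).view(2, 3)
t = base.t()                       # shape (3, 2), non-contiguous view
assert not t.is_contiguous()

result = t.repeat(2, 2)            # shape (6, 4)
expected = np.tile(base.numpy().T, (2, 2))
assert (result.numpy() == expected).all()
```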
@apaszke I moved to using an …
apaszke left a comment
Looks good. Some minor comments, and should be good to merge once they are resolved.
- xxtensor = xtensor.expand_as(urtensor)
- urtensor.copy_(xxtensor)
+ urtensor.copy_(xtensor.expand_as(urtensor))
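This works because expand_as broadcasts size-1 dimensions without allocating new memory, and copy_ then materializes the broadcast values into the destination. A tiny illustration (shapes are made up, and current tensor constructors are used for brevity):

```python
import torch

xtensor = torch.tensor([[1.0, 2.0, 3.0]])    # shape (1, 3)
urtensor = torch.empty(4, 3)                 # destination buffer

# expand_as returns a stride-0 view; no data is copied until copy_ runs.
urtensor.copy_(xtensor.expand_as(urtensor))
assert urtensor[2].tolist() == [1.0, 2.0, 3.0]
```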
result = tensor.repeat(torchSize)
self.assertEqual(result.size(), target, 'Error in repeat using result and LongStorage')
self.assertEqual(result.mean(0).view(8, 4), tensor, 'Error in repeat (not equal)')
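The mean(0) assertion works because repeating along a new leading dimension stacks identical copies of the input, so averaging over that dimension recovers the original values. A sketch with assumed sizes (the test's actual torchSize is not shown here):

```python
import torch

tensor = torch.rand(8, 4)
result = tensor.repeat(torch.Size((3, 1, 1)))          # shape (3, 8, 4)

assert result.size() == torch.Size((3, 8, 4))
assert torch.allclose(result.mean(0).view(8, 4), tensor)
```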
Thank you!
Tentative solution for #4054.
I must admit that the actual details of what is happening in tensor.repeat remain somewhat opaque to me, so I opted for the minimal possible change. I would be happy to add some more robust tests; have you looked into Hypothesis? It's probably the most advanced property-based testing library out there, and is extremely good at making it easier to write effective tests.
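As a sketch of what such a property-based test could look like (not part of this PR; the strategy bounds and test name are arbitrary), one could ask Hypothesis to compare tensor.repeat against np.tile over random small shapes and repeat counts:

```python
import numpy as np
import torch
from hypothesis import given, strategies as st

@given(
    shape=st.lists(st.integers(min_value=1, max_value=4), min_size=1, max_size=3),
    reps=st.lists(st.integers(min_value=1, max_value=3), min_size=3, max_size=3),
)
def test_repeat_matches_np_tile(shape, reps):
    arr = np.arange(int(np.prod(shape)), dtype=np.float32).reshape(shape)
    t = torch.from_numpy(arr)              # numpy-shared, non-owning storage
    result = t.repeat(*reps).numpy()
    # len(reps) >= arr.ndim, so np.tile and Tensor.repeat agree on shape.
    expected = np.tile(arr, reps)
    assert np.array_equal(result, expected)
```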