Fix pointwise loss broadcast #12996
Conversation
facebook-github-bot left a comment:
ailzhang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
reduction = _Reduction.get_enum(reduction)
return torch._C._nn.smooth_l1_loss(input, target, reduction)
reduction = _Reduction.legacy_get_string(size_average, reduce)
return _pointwise_loss(
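For readers without the full diff: this hunk sits where smooth_l1_loss either calls the C++ kernel (torch._C._nn.smooth_l1_loss) or falls back to _pointwise_loss, a small Python helper that applies an elementwise loss and then reduces. Below is a minimal sketch of that helper's shape, with a hypothetical name and assuming the standard reduction semantics; it is not the PR's exact code.

```python
import torch

def pointwise_loss_sketch(lambd, input, target, reduction='mean'):
    # Apply the elementwise loss; tensor ops broadcast input against target,
    # so mismatched-but-broadcastable shapes yield the broadcast shape.
    d = lambd(input, target)
    if reduction == 'none':
        return d
    return torch.mean(d) if reduction == 'mean' else torch.sum(d)
```

Under reduction='none', an input of shape (2, 1) and a target of shape (2, 10) would produce a (2, 10) loss tensor, which is the kind of shape check the test discussed later performs against target.size().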
@pytorchbot retest this please
torch/nn/functional.py (Outdated):
return torch._C._nn.smooth_l1_loss(input, target, reduction)
reduction = _Reduction.legacy_get_string(size_average, reduce)
return _pointwise_loss(
    lambda a, b: torch.where(torch.abs(a - b) < 1, 0.5 * (torch.abs(a - b)) ** 2, torch.abs(a - b) - 0.5),
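The lambda in this hunk is the elementwise smooth L1 loss (the Huber loss with delta = 1): 0.5·d² when |d| < 1 and |d| − 0.5 otherwise, where d = a − b. A quick stand-alone check of both branches; the values are chosen purely for illustration and are not taken from the PR.

```python
import torch

# Same expression as in the hunk above, written out for a numeric check.
smooth_l1 = lambda a, b: torch.where(torch.abs(a - b) < 1,
                                     0.5 * torch.abs(a - b) ** 2,
                                     torch.abs(a - b) - 0.5)

a = torch.tensor([0.5, 2.0])
b = torch.tensor([0.0, 0.0])
print(smooth_l1(a, b))  # tensor([0.1250, 1.5000]): quadratic branch, then linear branch
```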
can we have a test?
test/test_nn.py (Outdated):
# When target.requires_grad=True, its impl is in Python, while the other is in TH.
target = torch.ones(2, 10, requires_grad=requires_grad)
l = fn(input, target, 'none')
self.assertEqual(l.size(), target.size())
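A self-contained version of the kind of check this test hunk performs could look like the sketch below. The loss list, the input shape, and the helper name are illustrative assumptions rather than the PR's exact test code.

```python
import torch
import torch.nn.functional as F

def check_pointwise_loss_broadcast():
    losses = {
        'l1_loss': lambda i, t, r: F.l1_loss(i, t, reduction=r),
        'mse_loss': lambda i, t, r: F.mse_loss(i, t, reduction=r),
        'smooth_l1_loss': lambda i, t, r: F.smooth_l1_loss(i, t, reduction=r),
    }
    input = torch.randn(2, 1, requires_grad=True)  # illustrative broadcastable shape
    for name, fn in losses.items():
        for requires_grad in (True, False):
            # At the time of this PR, target.requires_grad=True exercised the
            # Python fallback path, while the other case went through the TH kernel.
            target = torch.randn(2, 10, requires_grad=requires_grad)
            l = fn(input, target, 'none')
            assert l.size() == target.size(), name
```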
test/test_nn.py (Outdated):
for requires_grad in (True, False):
    # When target.requires_grad=True, its impl is in Python, while the other is in TH.
    target = torch.randn(2, 10, requires_grad=requires_grad)
    l = fn(input, target, 'none')
This has been landed to master; not sure why it didn't get closed, though. Closing manually.
Fixes #12129, #12327
cc: @fmassa
`_pointwise_loss` is currently only called by `l1_loss`, `mse_loss` and `smooth_l1_loss`, all of which make sense to broadcast.
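As a rough illustration of why broadcasting applies uniformly to these three losses: before reduction, each one boils down to an elementwise expression built from ops that already broadcast. The sketch below uses the standard definitions of the three losses; it is not code from the PR.

```python
import torch

# Elementwise cores of the three losses routed through _pointwise_loss.
l1        = lambda a, b: torch.abs(a - b)
mse       = lambda a, b: (a - b) ** 2
smooth_l1 = lambda a, b: torch.where(torch.abs(a - b) < 1,
                                     0.5 * (a - b) ** 2,
                                     torch.abs(a - b) - 0.5)

a, b = torch.randn(2, 1), torch.randn(2, 10)
for core in (l1, mse, smooth_l1):
    assert core(a, b).shape == (2, 10)  # all three broadcast identically
```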