Fix EmbeddingBag NaN output when input is empty #21400
Conversation
```
import torch
Embed = torch.nn.EmbeddingBag(100, 10, sparse=True)
print(Embed(input=torch.LongTensor([]), offsets=torch.LongTensor([0])))
print(Embed(input=torch.LongTensor([]), offsets=torch.LongTensor([0, 0])))
```
Before this fix:
```
tensor([[nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]])
tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
```
After this fix:
```
tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
```
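The reviewers below ask for a test; a minimal regression check for this fix might look like the sketch below (illustrative only, not the test actually added in the PR — the variable names and shapes are assumptions). An empty bag should reduce to a zero vector rather than NaN:

```python
import torch

# EmbeddingBag with default mode='mean'; an empty bag used to divide by a
# zero count and emit NaN. After the fix it should yield an all-zero row.
embed = torch.nn.EmbeddingBag(100, 10, sparse=True)

out = embed(input=torch.LongTensor([]), offsets=torch.LongTensor([0]))

assert not torch.isnan(out).any(), "empty bag must not produce NaN"
assert torch.equal(out, torch.zeros(1, 10))
```

The second assertion is stricter than the NaN check alone: it also pins down the fixed behavior (zeros) so a future regression to any other value would be caught.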
facebook-github-bot
left a comment
@bddppq has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
zou3519
left a comment
Add test please
ezyang
left a comment
needs test
zou3519
left a comment
lgtm, thank you!
Summary:
Pull Request resolved: pytorch/pytorch#21400
Differential Revision: D15643357
Pulled By: bddppq
fbshipit-source-id: 119eba38129dc0a3757c331304a18044714fcca5