Make fmod work with zero divisors consistently #41948
Conversation
peterbell10
left a comment
A couple of nits, but overall this looks good. The scalar kernel must have been overlooked in #38919.
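For context, a minimal sketch of the inconsistency being fixed, assuming (per the comment above) that the tensor-divisor kernel was already handled in #38919 while the scalar-divisor kernel was not:

```python
import torch

a = torch.tensor(1, dtype=torch.int)

# Tensor divisor: goes through the kernel addressed in #38919,
# so a zero divisor is handled rather than crashing the process.
a.fmod(torch.tensor(0, dtype=torch.int))

# Scalar divisor: dispatched to the scalar kernel, which before this
# PR performed a raw integer modulo and died with SIGFPE.
a.fmod(0)
```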
Force-pushed from 6e19bc6 to 24c40f2.
Currently `torch.tensor(1, dtype=torch.int).fmod(0)` crashes (floating point exception). This PR should fix this issue.
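A sketch of the intended behavior after the fix; the exact error type is an assumption here, chosen to match how PyTorch reports integer division by zero elsewhere:

```python
import torch

# Integer fmod with a zero divisor: before this PR, the scalar path
# crashed the whole process with a floating point exception (SIGFPE).
# After the fix it should fail in a catchable way (assumed here:
# RuntimeError, consistent with integer division-by-zero handling).
try:
    torch.tensor(1, dtype=torch.int).fmod(0)
except RuntimeError as e:
    print("caught:", e)

# Floating-point fmod follows C++ std::fmod semantics: a zero divisor
# yields NaN instead of raising.
print(torch.tensor(1.0).fmod(0))  # tensor(nan)
```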
💊 CI failures summary (as of commit 95a921a): 💚 Looks good so far! There are no failures yet. 💚 (Automated comment from Dr. CI.)
@peterbell10 I've updated. Is this ready to merge?
peterbell10
left a comment
LGTM
@pytorchbot merge this please
facebook-github-bot
left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.