[distributions] Refactor _log_sum_exp
#9173
Conversation
  self._validate_sample(value)
  logits, value = broadcast_all(self.logits, value)
- return -binary_cross_entropy_with_logits(logits, value, reduce=False)
+ return -binary_cross_entropy_with_logits(logits, value, reduction='none')
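The change above swaps the deprecated `reduce=False` flag for `reduction='none'`; both request per-element (unreduced) losses. As a rough illustration of the per-element quantity involved — a pure-Python sketch using the numerically stable form max(x, 0) - x·z + log(1 + exp(-|x|)), not PyTorch's actual implementation, and with hypothetical helper names:

```python
import math

def bce_with_logits(logit, target):
    """Per-element binary cross-entropy with logits, i.e.
    -[t*log(sigmoid(x)) + (1-t)*log(1-sigmoid(x))],
    computed stably as max(x, 0) - x*t + log1p(exp(-|x|))."""
    return max(logit, 0.0) - logit * target + math.log1p(math.exp(-abs(logit)))

def bernoulli_log_prob(logit, value):
    # Bernoulli log-probability is the negative of the per-element loss,
    # mirroring the `return -binary_cross_entropy_with_logits(...)` line above.
    return -bce_with_logits(logit, value)
```

For example, with logit 0 (probability 0.5) and value 1, the log-probability is log(0.5).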
  def entropy(self):
-     return binary_cross_entropy_with_logits(self.logits, self.probs, reduce=False)
+     return binary_cross_entropy_with_logits(self.logits, self.probs, reduction='none')
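This entropy line works because, for a Bernoulli with success probability p, the per-element BCE of the logits against the probs equals -p·log p - (1-p)·log(1-p), the closed-form entropy. A quick pure-Python sanity check (the stable BCE formula is restated for illustration; this is not PyTorch code):

```python
import math

def bce_with_logits(logit, target):
    # Stable per-element form of -[t*log(sigmoid(x)) + (1-t)*log(1-sigmoid(x))]
    return max(logit, 0.0) - logit * target + math.log1p(math.exp(-abs(logit)))

p = 0.3
logit = math.log(p / (1 - p))  # inverse sigmoid of p

entropy_closed_form = -p * math.log(p) - (1 - p) * math.log(1 - p)
entropy_via_bce = bce_with_logits(logit, p)  # matches the diff's entropy()
```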
  def entropy(self):
-     return binary_cross_entropy_with_logits(self.logits, self.probs, reduce=False) / self.probs
+     return binary_cross_entropy_with_logits(self.logits, self.probs, reduction='none') / self.probs
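Here the same BCE quantity is divided by the probs: for a Geometric distribution on {0, 1, 2, …} with success probability p, the entropy is [-p·log p - (1-p)·log(1-p)] / p. A numeric check against the direct series -Σₖ P(k)·log P(k), in plain Python (helper restated; not PyTorch code):

```python
import math

def bce_with_logits(logit, target):
    # Stable per-element binary cross-entropy with logits
    return max(logit, 0.0) - logit * target + math.log1p(math.exp(-abs(logit)))

p = 0.4
logit = math.log(p / (1 - p))

# Entropy via the diff's formula: BCE(logits, probs) / probs
entropy_via_bce = bce_with_logits(logit, p) / p

# Direct series: P(k) = (1-p)^k * p, truncated once terms are negligible
entropy_series = -sum(
    (1 - p) ** k * p * (k * math.log(1 - p) + math.log(p))
    for k in range(200)
)
```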
@pytorchbot test this please
This reverts commit ceae2be.
facebook-github-bot left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
* upstream/master: (24 commits)
  - Implement tensor weak references (pytorch#9363)
  - Nuke TestCollectEnv (pytorch#9459)
  - Add test case for segmentation fault fix in grad_fn (pytorch#9457)
  - Add peephole optimization for type_as operators. (pytorch#9316)
  - Fix out-of-range error for test_neg (pytorch#9431)
  - add depthwise conv support for mkldnn (pytorch#8782)
  - Refactor `_log_sum_exp` (pytorch#9173)
  - Add ModuleDict and ParameterDict containers (pytorch#8463)
  - Introduce SupervisedPtr, delete THAllocator and THCDeviceAllocator (pytorch#9358)
  - Introducing IsInf (pytorch#9169)
  - add device to CUDAEvent (pytorch#9415)
  - Make localScalar error message more intuitive (pytorch#9443)
  - Only accept contiguous tensors in TopK for cuda (pytorch#9441)
  - Add support for .norm() pytorch onnx export and ReduceL1/ReduceL2 caffe2 operators (pytorch#9299)
  - Only view() rhs of index_put if we need to (pytorch#9424)
  - Add BatchBucketizeOp in caffe2 (pytorch#9385)
  - Implementation of Wngrad optimizer caffe2 python wrapper and unit test on least square regression (pytorch#9001)
  - Implementation and operator test for Wngrad optimizer (pytorch#8999)
  - Fix segmentation fault in grad_fn (pytorch#9292)
  - update docs (pytorch#9423)
  - ...
Summary: This PR removes `distributions.utils._log_sum_exp` in favor of `torch.logsumexp`. Also fixes some warnings with the `reduce` arg. in `binary_cross_entropy_with_logits`.
Pull Request resolved: pytorch#9173
Reviewed By: SsnL
Differential Revision: D8764174
Pulled By: ezyang
fbshipit-source-id: b9c4136dbf0182e8ae77082e6448d23a430d5cb6
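The `torch.logsumexp` that replaces the private `_log_sum_exp` helper computes log Σᵢ exp(xᵢ) using the standard max-subtraction trick, so that large inputs do not overflow exp. A minimal pure-Python sketch of that trick (an illustration of the technique, not the PyTorch implementation):

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x_i))): factor out the maximum
    so every exponent is <= 0 and cannot overflow."""
    m = max(xs)
    if math.isinf(m):
        # All -inf (empty log-probability mass), or contains +inf
        return m
    return m + math.log(sum(math.exp(x - m) for x in xs))
```

A naive `math.log(sum(math.exp(x) for x in xs))` would overflow for inputs like [1000.0, 1000.0], while the stabilized version returns 1000 + log 2.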