
Conversation

@neerajprad (Contributor) commented Apr 18, 2018

Currently the Binomial distribution only allows a scalar integer count, i.e. total_count stays fixed while the probability tensor can vary. Since we are drawing Bernoullis internally when sampling from Binomial, there is no need to restrict total_count to a single integer, and we can support vectorized counts with a small modification. This will be really useful for building beta-binomial regression models, for instance.
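As a minimal sketch of the usage this enables (the tensor values below are illustrative, not taken from the PR's tests):

```python
import torch
from torch.distributions import Binomial

# total_count can now be a tensor that broadcasts with probs, so each batch
# element gets its own number of trials.
total_count = torch.tensor([1., 2., 10., 100.])
probs = torch.tensor([0.5, 0.3, 0.9, 0.25])

d = Binomial(total_count=total_count, probs=probs)
samples = d.sample()          # one draw per (count, prob) pair
log_p = d.log_prob(samples)   # log-probabilities under the matching counts
```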

Additionally, some of the methods assumed that total_count > 0. Made some minor changes to ensure that total_count = 0 works as expected, and added this special case to the tests.
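A rough check of the degenerate case (illustrative, not the PR's actual test):

```python
import torch
from torch.distributions import Binomial

# Zero trials: the only possible outcome is zero successes.
d = Binomial(total_count=torch.tensor([0.]), probs=torch.tensor([0.7]))
print(d.sample())                       # tensor([0.])
print(d.log_prob(torch.tensor([0.])))   # tensor([0.]), i.e. probability 1
```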

Hat tip to @fritzo for suggesting this. Math reviewed in probtorch#148.

cc. @apaszke


@neerajprad (Contributor, Author)

There seems to be an overflow error when we construct the message string in TestConstraints for the distributions test. I am not sure why that's getting triggered.


@ezyang (Contributor) commented Apr 22, 2018

@yf225 GPU perf test failed on this one

@yf225 (Contributor) commented Apr 22, 2018

@ezyang The GPU perf test suite wasn't updated at the time. Fixed in pytorch/examples#332.

@yf225 (Contributor) commented Apr 22, 2018

@pytorchbot retest this please

@ezyang (Contributor) commented Apr 23, 2018

@pytorchbot retest this please

@neerajprad (Contributor, Author)

@apaszke - I will address all your comments shortly. Thanks for reviewing!

@neerajprad (Contributor, Author)

@apaszke - On a related note, are there plans to support implicit casting for operations, like LongTensor + FloatTensor = FloatTensor? In many Pyro models, samples from one distribution are passed as parameters to another. For instance, we might sample a FloatTensor from a Categorical and provide it as counts to a Binomial distribution, which currently needs a type_as on both ends. It would be great if PyTorch had some implicit casting that would allow for this.
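A rough sketch of the manual casting being described here; the shapes and values are made up for illustration:

```python
import torch
from torch.distributions import Categorical, Binomial

# Draw counts from a Categorical; the sample dtype generally differs from the
# dtype of the downstream distribution's parameters.
counts = Categorical(torch.tensor([0.2, 0.3, 0.5])).sample((4,))

success_prob = torch.tensor([0.1, 0.4, 0.6, 0.9])
# Without implicit cross-type casting, the counts must be explicitly cast to
# match the probability tensor before being reused as a parameter.
d = Binomial(total_count=counts.type_as(success_prob), probs=success_prob)
sample = d.sample()
```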

@apaszke (Contributor) commented Apr 25, 2018

LGTM, but the tests are failing.

Yes, we are planning to add more and more support for cross-type ops.

@apaszke (Contributor) commented Apr 26, 2018

@pytorchbot retest this please

@apaszke apaszke merged commit 3964253 into pytorch:master Apr 26, 2018
Jorghi12 pushed a commit to wsttiger/pytorch that referenced this pull request May 10, 2018
weiyangfb pushed a commit to weiyangfb/pytorch that referenced this pull request Jun 11, 2018