Conversation

@fehiepsi
Contributor

This pull request adds HalfNormal and HalfCauchy distributions. Both are transformed distributions built on AbsTransform, but AbsTransform does not provide the sign and log_abs_det_jacobian methods, so I think it is worthwhile to implement them here.

As mentioned in this user request, these half distributions are popular priors for positive variables (currently, I believe Exponential is the only built-in way to put a prior on such a variable). Their absence adds overhead when benchmarking against other frameworks.
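For reference, the densities involved are simple folds of the full distributions: for x >= 0, the half-normal density is exactly twice the Normal(0, scale) density, and the half-Cauchy density is twice the Cauchy(0, scale) density. A minimal pure-Python sketch of the log densities (an illustration only, not the PR's implementation, which subclasses TransformedDistribution):

```python
import math

def half_normal_log_prob(x, scale=1.0):
    """Log density of HalfNormal(scale) on [0, inf):
    log 2 + Normal(0, scale).log_prob(x)."""
    if x < 0:
        return float("-inf")
    return 0.5 * math.log(2.0 / math.pi) - math.log(scale) - x * x / (2.0 * scale * scale)

def half_cauchy_log_prob(x, scale=1.0):
    """Log density of HalfCauchy(scale) on [0, inf):
    log 2 + Cauchy(0, scale).log_prob(x)."""
    if x < 0:
        return float("-inf")
    return math.log(2.0 / math.pi) - math.log(scale) - math.log1p((x / scale) ** 2)

# Each half density is twice the corresponding full density on [0, inf);
# e.g. half_normal_log_prob(1.0) == log(2) + standard-normal log-pdf at 1.0.
normal_lp_at_1 = -0.5 * math.log(2.0 * math.pi) - 0.5
assert abs(half_normal_log_prob(1.0) - (math.log(2.0) + normal_lp_at_1)) < 1e-12
```

The log(2) term is exactly the log_abs_det_jacobian correction the PR needs AbsTransform to supply: folding the real line onto the half line doubles the density everywhere on the support.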

cc @apaszke @fritzo

Collaborator

@fritzo fritzo left a comment


Looks reasonable to me.

@jpchen can you confirm that it's ok to omit the loc parameter from these because we only ever use loc=0 in practice? Our Pyro implementation has a loc parameter, but it seems fine to drop it.
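For what it's worth, dropping loc loses little expressiveness: a location-shifted half distribution can be recovered by translating the input, since a pure translation has unit Jacobian. A hedged sketch (shifted_half_normal_log_prob is a hypothetical helper for illustration, not part of this PR or of Pyro):

```python
import math

def half_normal_log_prob(x, scale=1.0):
    # HalfNormal(scale) log density on [0, inf)
    if x < 0:
        return float("-inf")
    return 0.5 * math.log(2.0 / math.pi) - math.log(scale) - x * x / (2.0 * scale * scale)

def shifted_half_normal_log_prob(x, loc=0.0, scale=1.0):
    # A half-normal with support [loc, inf): shift the input by loc.
    # The Jacobian of a translation is 1, so no correction term is needed.
    return half_normal_log_prob(x - loc, scale)

# With loc=0 this reduces exactly to the plain half-normal:
assert shifted_half_normal_log_prob(1.0, loc=0.0) == half_normal_log_prob(1.0)
```

In torch.distributions terms, the same effect could be obtained by composing the loc-free distribution with an affine shift, which is presumably why omitting loc is acceptable.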


def log_prob(self, value):
    if self._validate_args:
        self._validate_sample(value)

This comment was marked as off-topic.

This comment was marked as off-topic.

Contributor

@apaszke apaszke left a comment


LGTM, apart from the double sample validation (it appears in both distributions).

Contributor

@jpchen jpchen left a comment


In my experience, these half-distributions are used in hierarchical models, almost always to restrict a parameter to positive support, so I suppose it's fine.
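As a concrete (hypothetical) illustration of that usage, here is the log joint of a simple hierarchical normal model with a half-Cauchy prior on the group-level scale; the prior's positive support is exactly what makes it valid for a scale parameter. This is a pure-Python sketch under assumed model choices, not code from the PR:

```python
import math

def normal_log_prob(x, loc, scale):
    # Normal(loc, scale) log density
    return (-0.5 * math.log(2.0 * math.pi) - math.log(scale)
            - (x - loc) ** 2 / (2.0 * scale ** 2))

def half_cauchy_log_prob(x, scale=1.0):
    # HalfCauchy(scale) log density on [0, inf)
    if x < 0:
        return float("-inf")
    return math.log(2.0 / math.pi) - math.log(scale) - math.log1p((x / scale) ** 2)

def log_joint(tau, group_means, data, sigma=1.0):
    """Log joint of: tau ~ HalfCauchy(1); mu_g ~ Normal(0, tau);
    y ~ Normal(mu_g, sigma) for each observation y in group g."""
    if tau <= 0:
        return float("-inf")  # the half-Cauchy prior rules out non-positive scales
    lp = half_cauchy_log_prob(tau)
    for mu, ys in zip(group_means, data):
        lp += normal_log_prob(mu, 0.0, tau)
        lp += sum(normal_log_prob(y, mu, sigma) for y in ys)
    return lp

# Finite for a positive scale, -inf outside the prior's support:
lp = log_joint(0.5, [0.1, -0.2], [[0.0, 0.3], [-0.1]])
assert math.isfinite(lp)
```

With Exponential as the only positive-support prior, this kind of model is awkward to express, which matches the motivation for adding the half distributions.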

@fehiepsi
Contributor Author

@apaszke The error comes from the Caffe2 build, which is likely unrelated to this pull request.

@apaszke apaszke merged commit 9d88ff7 into pytorch:master Jun 14, 2018