
Conversation

@ita9naiwa (Contributor) commented Aug 4, 2020

#40980

I have a few questions that came up while implementing the polygamma function, so I'm opening this PR before completing it.

  1. Some code blocks were brought in from the Cephes library (and I did the same); they carry this notice:
/*
 * The following function comes with the following copyright notice.
 * It has been released under the BSD license.
 *
 * Cephes Math Library Release 2.8:  June, 2000
 * Copyright 1984, 1987, 1992, 2000 by Stephen L. Moshier
 */

Is it okay for me to use Cephes code under this same copyright notice (which already appears in the PyTorch codebase)?

  2. There is no linting for the internal ATen library (as far as I know; I read https://github.com/pytorch/pytorch/blob/master/CONTRIBUTING.md).
    How can I be sure my code follows the appropriate guidelines of this library?

  3. There are already digamma and trigamma functions.
    digamma is still needed, but trigamma becomes redundant once polygamma is added.
    Is it okay for trigamma to stay, or should it be removed?

By the way, the CPU version now works fine for third-order polygamma (which is what we need for variational inference with beta/gamma distributions), and I'm going to finish the GPU version soon.
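
For context, a minimal sketch of how the extended torch.polygamma is expected to behave (polygamma of order n is the n-th derivative of the digamma function; the inputs here are illustrative):

import torch

x = torch.tensor([0.5, 1.0, 2.5], dtype=torch.float64)

# Order 0 is digamma: psi(x) = d/dx log(Gamma(x)).
assert torch.allclose(torch.polygamma(0, x), torch.digamma(x))

# Order 1 is trigamma, which is why a dedicated trigamma kernel
# becomes redundant once general polygamma is available.
print(torch.polygamma(1, x))

# Order 3 is the case needed for variational inference with
# beta/gamma distributions, as mentioned above.
print(torch.polygamma(3, x))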

dr-ci bot commented Aug 4, 2020

💊 CI failures summary and remediations

As of commit 5f9d526 (more details on the Dr. CI page):


None of the CI failures appear to be your fault 💚



🚧 1 ongoing upstream failure:

These were probably caused by upstream breakages that are not fixed yet:


🚧 2 fixed upstream failures:

These were probably caused by upstream breakages that were already fixed.

Please rebase on the viable/strict branch.

If your commit is newer than viable/strict, you can try basing on an older, stable commit:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase --onto FETCH_HEAD $(git merge-base origin/master HEAD)

If your commit is older than viable/strict:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase FETCH_HEAD

Check out the recency history of this "viable master" tracking branch.


This comment was automatically generated by Dr. CI. Follow this link to opt out of these comments for your pull requests.

Please report bugs/suggestions on the GitHub issue tracker or post in the (internal) Dr. CI Users group.

See how this bot performed.

This comment has been revised 97 times.

@ita9naiwa ita9naiwa marked this pull request as draft August 4, 2020 01:19
@ita9naiwa ita9naiwa marked this pull request as ready for review August 4, 2020 09:29
@ita9naiwa ita9naiwa changed the title from [WIP] Add polygamma to Add polygamma where n >= 2 on Aug 4, 2020
@ita9naiwa ita9naiwa closed this Aug 5, 2020
@ita9naiwa ita9naiwa reopened this Aug 5, 2020
@ita9naiwa (Contributor, Author) commented

Hi @albanD. I'm mentioning you because I've finished implementing the polygamma changes, but CI produces some errors that I cannot fix.

@ailzhang ailzhang requested review from albanD and gchanan August 7, 2020 01:49
@ailzhang ailzhang added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Aug 7, 2020
@albanD (Collaborator) commented Aug 10, 2020

Hi,
Thanks for the PR!

Yes, this issue was on master :/ You can rebase on top of master to make it disappear (note that you can see the CI status for the master branch here: https://ezyang.github.io/pytorch-ci-hud/build2/pytorch-master).

I'll take a look at the code tomorrow.

@albanD (Collaborator) left a review comment

Hi, Thanks for the PR!

Some high level questions:

  • Does the doc need an update to state what now works? Also if we only test 0,1,2,3 it might be worth mentioning it.
  • Does the autograd test need to be updated to test higher values of n as well? And checking it, I can't find it in the generic autograd tests, so is the backward actually checked at all?

@ita9naiwa (Contributor, Author) commented Aug 12, 2020

Thanks for your kind review!

  • Does the doc need an update to state what now works? Also if we only test 0,1,2,3 it might be worth mentioning it.

It should work for arbitrary integer orders beyond 0, 1, 2, 3. I omitted tests for those because, as far as I know, only derivatives up to third order are commonly used in deep learning (especially variational inference involving the gamma distribution).

It's fine to add tests that check higher n, but IMHO such tests should be handled carefully, because the results may differ slightly (on the order of 1e-8 to 1e-10) due to numerical rounding.
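
As an illustration (not the test added in this PR), a tolerance-based check against SciPy's reference implementation could look like the sketch below; the orders, grid, and tolerances are assumptions for illustration, and it requires scipy to be installed:

import torch
from scipy import special

# A grid away from the poles of polygamma at non-positive integers.
x = torch.linspace(0.1, 10.0, 100, dtype=torch.float64)

for n in range(6):
    # SciPy's polygamma serves as the reference value.
    expected = torch.from_numpy(special.polygamma(n, x.numpy()))
    actual = torch.polygamma(n, x)
    # Loose tolerances absorb the rounding differences noted above.
    assert torch.allclose(actual, expected, rtol=1e-7, atol=1e-8)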

  • Does the autograd test need to be updated to test higher values of n as well? And checking it, I can't find it in the generic autograd tests, so is the backward actually checked at all?

I added them; the sketch below shows roughly the kind of check involved. I'm not sure the grad tests should be placed there (around line 18703 in test/test_torch.py).
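
For reference, a minimal gradcheck sketch of that kind of test (a sketch under assumed inputs, not the exact test added to test/test_torch.py):

import torch
from torch.autograd import gradcheck

# Double precision and inputs bounded away from the poles at
# non-positive integers keep the finite-difference check stable.
x = torch.rand(10, dtype=torch.float64, requires_grad=True) + 0.5

for n in range(4):  # orders 0..3
    # d/dx polygamma(n, x) = polygamma(n + 1, x), so checking the
    # backward of order n exercises the order n + 1 implementation.
    assert gradcheck(lambda t, n=n: torch.polygamma(n, t), (x,))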

And I'd like to apologize for the messy commits.

@albanD (Collaborator) commented Aug 12, 2020

And I'd like to apologize for the messy commits.

Don't worry about this.
They will be automatically squashed before being merged by the bot.

@albanD (Collaborator) left a review comment

Looks good!
Can you please rebase on top of master to make sure that all the ONNX tests pass?

@ita9naiwa (Contributor, Author) commented

Okay, CI will be happy with this now.

@albanD (Collaborator) commented Aug 13, 2020

Perfect! Thanks!
The ONNX build is temporarily broken because of a dependency that broke :/ So you can ignore that one!

@facebook-github-bot (Contributor) left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented

@albanD merged this pull request in 91b090c.


Labels

Merged, open source, triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)


6 participants