
Conversation

@weiyangfb weiyangfb commented Jul 27, 2018

```
loss = F.multilabel_soft_margin_loss(outputs, labels, reduction='none')
loss_mean = F.multilabel_soft_margin_loss(outputs, labels, reduction='elementwise_mean')
loss_sum = F.multilabel_soft_margin_loss(outputs, labels, reduction='sum')

loss.sum() == loss_sum  # True
loss.mean() == loss_mean  # True
```
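
For reference, a minimal runnable sketch of this check (not part of the PR: the tensor shapes are made up, torch.allclose is used instead of == to avoid exact float comparison, and 'elementwise_mean' has since been renamed 'mean'):

```
import torch
import torch.nn.functional as F

torch.manual_seed(0)
outputs = torch.randn(4, 3)                   # (N, C) logits, hypothetical sizes
labels = torch.randint(0, 2, (4, 3)).float()  # (N, C) multi-hot targets

loss = F.multilabel_soft_margin_loss(outputs, labels, reduction='none')   # shape (N,)
loss_mean = F.multilabel_soft_margin_loss(outputs, labels, reduction='mean')
loss_sum = F.multilabel_soft_margin_loss(outputs, labels, reduction='sum')

print(torch.allclose(loss.sum(), loss_sum))    # expected: True
print(torch.allclose(loss.mean(), loss_mean))  # expected: True
```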

@facebook-github-bot facebook-github-bot left a comment
weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


@soumith soumith left a comment
approved. change comment.



@weiyangfb weiyangfb force-pushed the stable_multilabel_soft_margin_loss branch from 1376a8b to 210f88e on July 31, 2018 02:39



@weiyangfb weiyangfb changed the title from "use logsigmoid at multilabel_soft_margin_loss" to "[wip] use logsigmoid at multilabel_soft_margin_loss, fix 'elementwise_mean'" on Aug 1, 2018
@weiyangfb weiyangfb force-pushed the stable_multilabel_soft_margin_loss branch from 9ea862e to 5a10812 on August 2, 2018 15:23
li-roy commented Aug 2, 2018

Why are we changing the behavior of all these losses?

@weiyangfb weiyangfb force-pushed the stable_multilabel_soft_margin_loss branch from 5a10812 to c2b71e0 on August 2, 2018 20:39

@weiyangfb weiyangfb changed the title from "[wip] use logsigmoid at multilabel_soft_margin_loss, fix 'elementwise_mean'" to "use logsigmoid at multilabel_soft_margin_loss, and change output from shape=(N, C) to (N,)" on Aug 2, 2018
```
return binary_cross_entropy(input, target, weight, None, None, reduction)

loss = -(target * logsigmoid(input) + (1 - target) * logsigmoid(-input))
loss = loss.sum(dim=1)  # only return N loss values
```
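
As an aside (not from the PR itself), a tiny sketch of why logsigmoid is preferable to composing log with sigmoid here: sigmoid underflows to 0 for very negative logits, so taking its log blows up to -inf, while logsigmoid stays finite:

```
import torch
import torch.nn.functional as F

x = torch.tensor([-200.0, 200.0])

print(torch.log(torch.sigmoid(x)))  # tensor([-inf, 0.]) -- sigmoid(-200) underflows to 0
print(F.logsigmoid(x))              # tensor([-200., 0.]) -- same quantity, evaluated stably
```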


@weiyangfb weiyangfb (author) commented:

@li-roy I made changes accordingly, maybe take a look when you get a chance?


@li-roy li-roy left a comment
Looks good for the most part, just need to be consistent in the behavior.

```
loss = loss.sum(dim=1)  # only return N loss values

if reduction == 'none':
    return loss
```


li-roy commented Aug 3, 2018

If you're planning to go with averaging across classes, don't forget to change the formula in the doc in torch/nn/modules/loss.py
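
Putting the review threads together, here is a hedged sketch of the resulting computation: the logsigmoid formulation, averaging across classes so reduction='none' yields one value per sample, then the usual reductions. This is an illustration rather than the exact merged code, and the reduction names follow current PyTorch, where 'elementwise_mean' became 'mean':

```
import torch
import torch.nn.functional as F

def multilabel_soft_margin_loss_sketch(input, target, reduction='mean'):
    # stable per-element loss: -[y * log sigmoid(x) + (1 - y) * log sigmoid(-x)]
    loss = -(target * F.logsigmoid(input) + (1 - target) * F.logsigmoid(-input))
    # average over classes -> one loss value per sample, shape (N,)
    loss = loss.sum(dim=1) / input.size(1)
    if reduction == 'none':
        return loss
    return loss.mean() if reduction == 'mean' else loss.sum()

# quick sanity check against the built-in (hypothetical shapes)
x = torch.randn(4, 3)
y = torch.randint(0, 2, (4, 3)).float()
print(torch.allclose(multilabel_soft_margin_loss_sketch(x, y),
                     F.multilabel_soft_margin_loss(x, y)))  # expected: True
```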


goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
use logsigmoid at multilabel_soft_margin_loss, and change output from shape=(N, C) to (N,) (pytorch#9965)

Summary:
- fixes pytorch#9141, pytorch#9301
- use logsigmoid at multilabel_soft_margin_loss to make it more stable (NOT fixing legacy MultiLabelSoftMarginCriterion)
- return (N) instead of (N, C) to match the same behavior as MultiMarginLoss
- Note that with this PR, the following behavior is expected:
```
loss = F.multilabel_soft_margin_loss(outputs, labels, reduction='none')
loss_mean = F.multilabel_soft_margin_loss(outputs, labels, reduction='elementwise_mean')
loss_sum = F.multilabel_soft_margin_loss(outputs, labels, reduction='sum')

loss.sum() == loss_sum  # True
loss.mean() == loss_mean  # True
```
Pull Request resolved: pytorch#9965

Differential Revision: D9038402

Pulled By: weiyangfb

fbshipit-source-id: 0fa94c7b3cd370ea62bd6333f1a0e9bd0b8ccbb9
@ezyang ezyang added the merged label Jun 26, 2019
```
input = torch.sigmoid(input)
return binary_cross_entropy(input, target, weight, None, None, reduction)

loss = -(target * logsigmoid(input) + (1 - target) * logsigmoid(-input))
```
A contributor commented on this code:
this could probably use torch.lerp (yet AFAIK it fails export to ONNX and has NaN problems: #71701)
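
For completeness, a small sketch (not from the PR) of that torch.lerp rewrite: since lerp(a, b, w) = (1 - w) * a + w * b, the per-element loss can be expressed as a lerp between the two logsigmoid terms, with the ONNX and NaN caveats from the comment above still applying:

```
import torch
import torch.nn.functional as F

x = torch.randn(4, 3)                    # logits
t = torch.randint(0, 2, (4, 3)).float()  # multi-hot targets

loss_explicit = -(t * F.logsigmoid(x) + (1 - t) * F.logsigmoid(-x))
# lerp(a, b, w) = a + w * (b - a) = (1 - w) * a + w * b
loss_lerp = -torch.lerp(F.logsigmoid(-x), F.logsigmoid(x), t)

print(torch.allclose(loss_explicit, loss_lerp))  # expected: True
```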

Successfully merging this pull request may close these issues.

Make torch.nn.functional.multilabel_soft_margin_loss more stable