
Conversation

@gokceneraslan (Contributor) commented Oct 28, 2017

Evaluating the logarithm of the input variable in the Poisson negative log likelihood leads to a NaN loss when the input is exactly zero. This PR adds a small epsilon inside the log to prevent this. See the equivalent Keras epsilon here: https://github.com/fchollet/keras/blob/master/keras/losses.py#L68
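
As a minimal illustration of the guard (the eps value below is illustrative, not necessarily the constant used in the patch):

import torch

eps = 1e-8  # illustrative small constant

rate = torch.zeros(1)              # a zero rate estimate
target = torch.Tensor([4.0, 0.0])  # counts; a zero count makes 0 * log(0) = NaN

# Unguarded: log(0) = -inf, so the loss is inf (and NaN where target is 0),
# and the gradient 1 - target/rate blows up at rate = 0
unguarded = rate - target * torch.log(rate)

# Guarded: log(rate + eps) stays finite, so loss and gradient are finite
guarded = rate - target * torch.log(rate + eps)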

Here is the code to reproduce the issue:

import numpy as np
import torch
from torch.autograd import Variable

torch.manual_seed(42)
np.random.seed(42)

poisson_mean = 4
poisson_numsample = 200

# Draw counts from a Poisson distribution; with rate 4, some counts are zero
samples = np.random.poisson(poisson_mean, poisson_numsample)

# Rate estimate initialized to zero, so the loss evaluates log(0)
param = Variable(torch.zeros(1), requires_grad=True)
torch_samples = Variable(torch.from_numpy(samples).float(), requires_grad=False)

optimizer = torch.optim.RMSprop([param], lr=0.1)
loss = torch.nn.PoissonNLLLoss(log_input=False, full=True)

# Fit the rate by gradient descent; without the epsilon guard the very first
# loss/gradient is non-finite and the parameter becomes NaN
for i in range(5000):
    optimizer.zero_grad()
    output = loss(param, torch_samples)
    output.backward()

    if i % 1000 == 0:
        print('Loss:', output.data[0])

    optimizer.step()

# With the fix, the fitted rate converges to the sample mean
print('Sample mean:', torch_samples.mean().data.numpy())
print('Fitted mean:', param.data.numpy())
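
For reference, a minimal sketch of what the guarded log_input=False branch computes (the eps default and mean reduction here are assumptions, not a verbatim copy of the merged diff; the full=True Stirling correction is omitted):

import torch

def poisson_nll(input, target, eps=1e-8):
    # loss = input - target * log(input + eps); eps keeps log() finite at input == 0
    return (input - target * torch.log(input + eps)).mean()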

@soumith soumith merged commit 638f0b5 into pytorch:master Nov 1, 2017
@soumith (Contributor) commented Nov 1, 2017

thanks!

