
When training, is it OK that loss_d_real is a negative value, and why? #114

@talengu

Description


The result is amazing! I have two questions about the losses.

  1. Do the three losses below influence each other, given that some are positive and some are negative?

     self._loss_d_real + self._loss_d_cond + self._loss_d_fake

  2. When I am training, loss_d_real is a negative value. Is that correct, and why? (See the sketch after this list.)

     self._loss_d_real = self._compute_loss_D(d_real_img_prob, True) * self._opt.lambda_D_prob

     def _compute_loss_D(self, estim, is_real):
         return -torch.mean(estim) if is_real else torch.mean(estim)
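
For context, here is a minimal sketch of why the real term comes out negative. It assumes the WGAN-style critic loss from _compute_loss_D above; the example score tensors and the omission of the conditional term are my own simplifications, not code from the repository.

```python
# Minimal sketch of the WGAN-style critic terms used in _compute_loss_D.
# The score tensors below are hypothetical examples, not values from the repo.
import torch

def compute_loss_D(estim, is_real):
    # Critic term: reward high scores on real samples (hence the minus sign),
    # penalize high scores on fake samples.
    return -torch.mean(estim) if is_real else torch.mean(estim)

# Suppose the critic assigns higher scores to real images than to fakes:
d_real_img_prob = torch.tensor([1.8, 2.1, 2.4])    # hypothetical critic outputs for real images
d_fake_img_prob = torch.tensor([-0.5, 0.2, -0.1])  # hypothetical critic outputs for fake images

loss_d_real = compute_loss_D(d_real_img_prob, True)   # -mean(real scores) = -2.1 (negative)
loss_d_fake = compute_loss_D(d_fake_img_prob, False)  #  mean(fake scores) ≈ -0.13

# The discriminator objective sums the individual terms (conditional term omitted here),
# so positive and negative terms can partially cancel in the total while each
# still contributes its own gradient.
loss_d = loss_d_real + loss_d_fake
print(loss_d_real.item(), loss_d_fake.item(), loss_d.item())
```

Under this reading, a negative loss_d_real just means the critic assigns positive scores to real images on average; since the critic is trained to push real scores up, the -torch.mean(estim) term tends to become more negative as training progresses.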

[Attached screenshots: "批注 2020-05-31 114808" (annotation) and two training images]
