btguijiyang
Acceptance rate: 100%
2019-05-05 14:51

Custom TensorFlow loss function focal_loss produces inf during training


import tensorflow as tf
from tensorflow.keras import backend as K

def focal_loss(alpha=0.25, gamma=2.):
    """Focal loss, used to counter the positive/negative sample imbalance
    during training and improve training performance.
    """
    def focal_loss_calc(y_true, y_pred):
        # keep the prediction at positive pixels, fill the rest with 1;
        # keep the prediction at negative pixels, fill the rest with 0
        positive = tf.where(tf.equal(y_true, 1), y_pred, tf.ones_like(y_pred))
        negative = tf.where(tf.equal(y_true, 0), y_pred, tf.zeros_like(y_pred))
        return -(alpha*K.pow(1.-positive, gamma)*K.log(positive) +
                 (1-alpha)*K.pow(negative, gamma)*K.log(1.-negative))
    return focal_loss_calc

self.keras_model.compile(optimizer=optimizer, loss=dice_focal_loss,
                         metrics=[mean_iou, dice_loss, focal_loss()])

The focal loss above starts out normal and gradually drops to around 0.025 during training, then suddenly becomes inf. How can this be fixed?
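
For reference, a minimal sketch of where the inf can come from (the tensor values below are purely illustrative, not from the actual training run): as soon as a positive pixel is predicted as exactly 0, K.log(positive) evaluates to -inf and the loss blows up.

import tensorflow as tf
from tensorflow.keras import backend as K

y_true = tf.constant([[1., 0.]])
y_pred = tf.constant([[0., 0.]])  # fully saturated, wrong prediction for the positive pixel

# same construction as in the loss above
positive = tf.where(tf.equal(y_true, 1), y_pred, tf.ones_like(y_pred))
print(-0.25 * K.pow(1. - positive, 2.) * K.log(positive))  # -> [[inf -0.]]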


1 answer

  • btguijiyang, 2 years ago

    Solution:

    import tensorflow as tf
    from tensorflow.keras import backend as K

    def focal_loss_calc(alpha=0.25, gamma=2., epsilon=1e-6):
        """Focal loss, used to counter the positive/negative sample imbalance
        during training and improve training performance.
        """
        def focal_loss(y_true, y_pred):
            positive = tf.where(tf.equal(y_true, 1), y_pred, tf.ones_like(y_pred))
            negative = tf.where(tf.equal(y_true, 0), y_pred, tf.zeros_like(y_pred))
            # adding epsilon inside the logs keeps them finite when the
            # prediction saturates at exactly 0 or 1
            return -alpha*K.pow(1.-positive, gamma)*K.log(positive+epsilon) - \
                (1-alpha)*K.pow(negative, gamma)*K.log(1.-negative+epsilon)
        return focal_loss
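
    A quick check (my own sketch, not part of the original answer) that the epsilon term keeps the loss finite even for completely saturated predictions, using the focal_loss_calc defined above:

    import tensorflow as tf

    loss_fn = loss_fn = focal_loss_calc()   # defaults: alpha=0.25, gamma=2., epsilon=1e-6
    y_true = tf.constant([[1., 0.]])
    y_pred = tf.constant([[0., 1.]])        # both pixels predicted completely wrong
    print(loss_fn(y_true, y_pred))          # large but finite (about 3.45 and 10.36), no inf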
    
