
How to implement a Kullback-Leibler loss in TensorFlow?


What you have there is the cross entropy. The KL divergence differs from it by the entropy of p, since KL(p || q) = H(p, q) - H(p), and should be something like:

import tensorflow as tf

def kl_divergence(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i); tf.math.log replaces TF 1.x's tf.log
    return tf.reduce_sum(p * tf.math.log(p / q))

This assumes that p and q are both 1-D float tensors of the same shape, and that the values of each sum to 1.
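
As a quick check, here is a minimal usage sketch assuming TF 2.x eager execution; the distributions p and q below are made-up illustrative values:

import tensorflow as tf

p = tf.constant([0.5, 0.5])
q = tf.constant([0.25, 0.75])

# Expected: 0.5*log(0.5/0.25) + 0.5*log(0.5/0.75) = 0.5*log(4/3)
print(kl_divergence(p, q).numpy())  # ≈ 0.1438

Note that any entry where p is 0 yields 0 * log(0) = NaN in floating point; library implementations such as Keras's KL divergence loss typically clip both inputs to a small epsilon before taking the log.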

It should also work if p and q are equally sized mini-batches of 1-D tensors that obey the above constraints.
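
In the batched case, tf.reduce_sum with no axis argument returns a single scalar: the KL divergences summed over the whole mini-batch. If you want one value per example instead, here is a sketch of a per-example variant; the axis=-1 argument and the function name are my addition, not part of the original answer:

import tensorflow as tf

def kl_divergence_per_example(p, q):
    # Hypothetical per-example variant: p and q have shape (batch_size, n)
    # and each row sums to 1; summing over axis=-1 keeps one KL value per row.
    return tf.reduce_sum(p * tf.math.log(p / q), axis=-1)

p = tf.constant([[0.5, 0.5], [0.9, 0.1]])
q = tf.constant([[0.25, 0.75], [0.5, 0.5]])
print(kl_divergence_per_example(p, q).numpy())  # shape (2,), one KL per row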