KL Divergence

In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy) is a measure of how one probability distribution is different from a second, reference probability distribution.

https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence


S=k_B\ln\Omega

\Omega: \text{the number of microstates}\qquad S: \text{entropy}\qquad k_B: \text{the Boltzmann constant}
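
For a quick feel of the formula, take an assumed toy system of N independent two-state units; it has \Omega = 2^N microstates, so

S = k_B \ln 2^N = N\,k_B \ln 2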

\Rightarrow

Information entropy

H=-\sum_{i=1}^{N}p(x_i)\ln p(x_i)
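
The arrow above can be read as follows: for \Omega equally likely microstates, p(x_i)=1/\Omega for every i, and the information entropy reduces to the Boltzmann form up to the constant k_B:

p(x_i)=\frac{1}{\Omega}\;\Rightarrow\;H=-\sum_{i=1}^{\Omega}\frac{1}{\Omega}\ln\frac{1}{\Omega}=\ln\Omega=\frac{S}{k_B}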

\Rightarrow

KL Divergence

D_{KL}(p\|q)=\sum_{i=1}^{N}p(x_i)\cdot(\ln p(x_i)-\ln q(x_i))=\sum_{i=1}^{N}p(x_i)\cdot\ln\frac{p(x_i)}{q(x_i)}
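
As a small numerical example (the distributions are chosen here purely for illustration), take p=(\tfrac{1}{2},\tfrac{1}{2}) and q=(\tfrac{1}{4},\tfrac{3}{4}):

D_{KL}(p\|q)=\tfrac{1}{2}\ln\frac{1/2}{1/4}+\tfrac{1}{2}\ln\frac{1/2}{3/4}\approx 0.144\ \text{nats}

D_{KL}(q\|p)=\tfrac{1}{4}\ln\frac{1/4}{1/2}+\tfrac{3}{4}\ln\frac{3/4}{1/2}\approx 0.131\ \text{nats}

The two directions disagree: D_{KL} is not symmetric, so it is a divergence rather than a distance metric.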


 
