
# KL Divergence


In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy) is a measure of how one probability distribution is different from a second, reference probability distribution.

https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

$S = k_B \ln \Omega$

$\Omega: \text{the number of microstates},\quad S: \text{entropy},\quad k_B: \text{Boltzmann constant}$

$\Rightarrow$

Information entropy: replacing the equiprobable microstates with a general probability distribution $p(x_i)$ yields the Shannon entropy

$H=-\sum_{i=1}^{N}p(x_i)\ln p(x_i)$
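As a quick numerical check, here is a minimal sketch of the entropy formula in Python (the function name `shannon_entropy` is my own; natural log, so the result is in nats):

```python
import math

def shannon_entropy(p):
    """H = -sum_i p_i * ln(p_i); terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A uniform distribution over N outcomes maximizes H, giving H = ln(N).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln(4) ≈ 1.386
```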

$\Rightarrow$

KL Divergence

$D_{KL}(p\,||\,q)=\sum_{i=1}^{N}p(x_i)\cdot(\ln p(x_i)-\ln q(x_i))=\sum_{i=1}^{N}p(x_i)\cdot\ln\frac{p(x_i)}{q(x_i)}$
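The sum above translates directly into code. Below is a minimal sketch for discrete distributions (the function name `kl_divergence` is my own; it follows the convention that terms with $p(x_i)=0$ contribute 0):

```python
import math

def kl_divergence(p, q):
    """D_KL(p||q) = sum_i p_i * ln(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [1/3, 1/3, 1/3]

print(kl_divergence(p, q))  # how far p is from the uniform distribution
print(kl_divergence(p, p))  # identical distributions → 0.0
```

Note the asymmetry: in general $D_{KL}(p\,||\,q) \neq D_{KL}(q\,||\,p)$, which is why KL divergence is not a metric.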
