Kullback-Leibler Divergence
The Kullback-Leibler divergence (also called relative entropy), denoted $D_{\text{KL}}(P \parallel Q)$, is a type of statistical distance, measuring how much an approximating probability distribution $Q$ differs from a true probability distribution $P$.
Definition
For discrete probability distributions $P$ and $Q$ defined on the same sample space $\mathcal{X}$, the relative entropy from $Q$ to $P$ is defined to be

$$D_{\text{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}.$$
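A minimal numerical sketch of the discrete definition, assuming NumPy is available; the distributions p and q below are illustrative values, not taken from any particular source:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    p and q are probability vectors over the same sample space.
    Terms with p[i] == 0 contribute nothing, by the usual
    convention 0 * log(0/q) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.3, 0.2]  # "true" distribution P (made-up values)
q = [0.4, 0.4, 0.2]  # approximating distribution Q (made-up values)
print(kl_divergence(p, q))  # approximately 0.0253 nats
```

Using the natural logarithm gives the result in nats; taking the logarithm base 2 instead would give the result in bits. If $q(x) = 0$ at some point where $p(x) > 0$, the divergence is infinite.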
For distributions $P$ and $Q$ of a continuous random variable, with probability density functions $p$ and $q$ respectively, relative entropy is defined to be

$$D_{\text{KL}}(P \parallel Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx.$$
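For the continuous case, a standard closed form exists for two univariate Gaussians, and it can be checked against the defining integral numerically. A sketch assuming SciPy, with illustrative parameters:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def kl_gaussians(mu1, s1, mu2, s2):
    # Standard closed form for D_KL(N(mu1, s1^2) || N(mu2, s2^2)) in nats
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 1.5  # illustrative parameters

# Numerical evaluation of the defining integral; the integrand is
# negligible outside [-10, 10] for these parameters.
integrand = lambda x: norm.pdf(x, mu1, s1) * (
    norm.logpdf(x, mu1, s1) - norm.logpdf(x, mu2, s2)
)
numeric, _ = quad(integrand, -10, 10)

print(kl_gaussians(mu1, s1, mu2, s2))  # approximately 0.3499
print(numeric)                         # agrees with the closed form
```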
In other words, the KL divergence is the expected excess surprise from using the approximation $Q$ as a model when the actual distribution is $P$. Note that it is not a true metric: it is not symmetric (in general $D_{\text{KL}}(P \parallel Q) \neq D_{\text{KL}}(Q \parallel P)$), nor does it satisfy the triangle inequality.
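The asymmetry is easy to observe numerically. SciPy's scipy.stats.entropy computes the KL divergence when given two arguments; the values below are illustrative:

```python
from scipy.stats import entropy

p = [0.5, 0.3, 0.2]
q = [0.1, 0.2, 0.7]

print(entropy(p, q))  # D_KL(P || Q), approximately 0.676 nats
print(entropy(q, p))  # D_KL(Q || P), approximately 0.635 nats: not equal
```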
Often, $P$ represents the data, the observations, or a measured probability distribution, and $Q$ represents a model.