Basic examples

Kullback [1] gives the following example (Table 2.1, Example 2.1). Let P and Q be the distributions shown in the table below. P is a binomial distribution with N = 2 and p = 0.4; Q is a discrete uniform distribution over the three possible outcomes x = 0, 1, or 2, each with probability 1/3.

x               0      1      2
Distribution P  0.36   0.48   0.16
Distribution Q  0.333  0.333  0.333
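
The entries for P follow directly from the binomial probability mass function with N = 2 and p = 0.4:

$$
P(x) = \binom{2}{x}(0.4)^x (0.6)^{2-x},
\qquad P(0) = 0.36,\quad P(1) = 0.48,\quad P(2) = 0.16
$$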

The Kullback–Leibler divergences in both directions, D_KL(P ∥ Q) and D_KL(Q ∥ P), are calculated as follows. This example uses the natural logarithm with base e, designated ln, so the results are in nats.

$$
D_{\text{KL}}(P \parallel Q) = \sum_x P(x)\,\ln\frac{P(x)}{Q(x)}
= 0.36 \ln\frac{0.36}{1/3} + 0.48 \ln\frac{0.48}{1/3} + 0.16 \ln\frac{0.16}{1/3}
\approx 0.0853
$$

$$
D_{\text{KL}}(Q \parallel P) = \sum_x Q(x)\,\ln\frac{Q(x)}{P(x)}
= \frac{1}{3}\left(\ln\frac{1/3}{0.36} + \ln\frac{1/3}{0.48} + \ln\frac{1/3}{0.16}\right)
\approx 0.0975
$$
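
For readers who want to reproduce these numbers, here is a minimal Python sketch of the same calculation; the function name kl_divergence is an illustrative choice, not taken from Kullback's text.

    import math

    def kl_divergence(p, q):
        # D(p || q) in nats for discrete distributions given as probability lists.
        # Terms with p_i = 0 contribute nothing, by the convention 0 * ln 0 = 0.
        return sum(p_i * math.log(p_i / q_i) for p_i, q_i in zip(p, q) if p_i > 0)

    # Distributions from the table: P is Binomial(2, 0.4), Q is uniform on {0, 1, 2}.
    P = [0.36, 0.48, 0.16]
    Q = [1/3, 1/3, 1/3]

    print(kl_divergence(P, Q))  # ~0.0853 nats
    print(kl_divergence(Q, P))  # ~0.0975 nats

Note that the two directions give different values: the KL divergence is not symmetric in its arguments.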
1. Kullback, S. (1959), Information Theory and Statistics, John Wiley & Sons. Republished by Dover Publications in 1968; reprinted in 1978: ISBN 0-8446-5625-9.