Kullback and Leibler (1951) [65] introduced
a measure of information associated with two probability distributions
of a discrete random variable. At the same time, they also developed the
idea of the Jeffreys (1946) [49] invariant.
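For reference, these two measures are usually written in the following standard forms (the symbols $D(P\|Q)$ and $J(P,Q)$ are generic notation, not necessarily that adopted later in this work):
\[
D(P\|Q) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i},
\qquad
J(P,Q) = D(P\|Q) + D(Q\|P) = \sum_{i=1}^{n} (p_i - q_i)\log\frac{p_i}{q_i}.
\]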
Sibson (1969) [95] studied the idea of information
radius, generally referred to as the Jensen difference divergence measure.
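In its two-distribution form with equal weights, the information radius is usually quoted as (notation again generic)
\[
I(P,Q) = \frac{1}{2}\left[\sum_{i=1}^{n} p_i \log\frac{2p_i}{p_i+q_i}
+ \sum_{i=1}^{n} q_i \log\frac{2q_i}{p_i+q_i}\right].
\]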
Taneja (1995) [108] presented a new divergence
measure, referred to as the arithmetic and geometric mean divergence measure.
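A commonly quoted form of this measure (the symbol $T(P,Q)$ is an assumption of ours) is
\[
T(P,Q) = \sum_{i=1}^{n} \frac{p_i+q_i}{2}\,\log\frac{p_i+q_i}{2\sqrt{p_i q_i}},
\]
i.e., it compares the arithmetic mean $\frac{p_i+q_i}{2}$ with the geometric mean $\sqrt{p_i q_i}$ of the two probabilities.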
On the other hand, Kerridge (1961) [63] studied
an expression, similar to Shannon's entropy, associated with two probability
distributions. This measure is generally referred to as inaccuracy.
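In a standard notation (possibly differing from the one used later in the text), the inaccuracy of $Q$ with respect to $P$ is
\[
H(P;Q) = -\sum_{i=1}^{n} p_i \log q_i,
\]
which reduces to Shannon's entropy $H(P) = -\sum_{i=1}^{n} p_i \log p_i$ when $Q = P$.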
Over the past years, researchers have been interested in one- and two-scalar parametric
generalizations of the above four classical measures of information.
These four measures have found deep applications in statistics, while
the fifth one is new. Some continuous extensions are also studied. Some other
divergence measures, such as the Bhattacharyya distance and the variational
distance, are also considered.
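For completeness, one common convention for these two distances (some authors include a factor of $\frac{1}{2}$ in the variational distance, or work with the Bhattacharyya coefficient $\sum_{i}\sqrt{p_i q_i}$ itself) is
\[
B(P,Q) = -\log\sum_{i=1}^{n}\sqrt{p_i q_i},
\qquad
V(P,Q) = \sum_{i=1}^{n} |p_i - q_i|.
\]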
Unless otherwise specified, it is understood that $P = (p_1, p_2, \ldots, p_n)$, $Q = (q_1, q_2, \ldots, q_n) \in \Delta_n$, $n \geq 2$, where $\Delta_n = \big\{(p_1, \ldots, p_n) : p_i \geq 0,\ \sum_{i=1}^{n} p_i = 1\big\}$.
If $p_i = 0$ or $q_i = 0$ for some $i$, the corresponding
term is also taken as zero. All the logarithms are with base 2.