Unified (r,s)-Divergence Measures
In Section 2.4 we studied the divergence measures known as the Jensen difference divergence measure (or information radius) and the Jeffreys-Kullback-Leibler divergence. Here our aim is to present different generalizations of these two measures having two scalar parameters. Their extension to a more general class is given in Chapter 5.
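As a reminder, the two measures and the relative information on which they are built are usually written as follows; the notation here is the conventional one, not copied from the source's expressions (2.7) and (2.9), which may differ in logarithm base or constant factors.

\[
  D(P\|Q) = \sum_{i=1}^{n} p_i \ln\frac{p_i}{q_i}
  \qquad \text{(relative information)}
\]
\[
  J(P\|Q) = D(P\|Q) + D(Q\|P) = \sum_{i=1}^{n} (p_i - q_i)\ln\frac{p_i}{q_i}
  \qquad \text{(Jeffreys-Kullback-Leibler divergence)}
\]
\[
  I(P\|Q) = H\!\left(\frac{P+Q}{2}\right) - \frac{1}{2}\bigl[H(P) + H(Q)\bigr]
  \qquad \text{(Jensen difference divergence / information radius)}
\]

where $H$ denotes the Shannon entropy and $P=(p_1,\dots,p_n)$, $Q=(q_1,\dots,q_n)$ are finite discrete probability distributions.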
We see that the Jensen difference and Jeffreys-Kullback-Leibler divergence measures, given respectively in (2.9) and (2.7), depend on the relative information. Based on the unified expression for the relative information and on the expressions (2.9) and (2.7), we can generalize these two divergence measures. This is done in the first generalization. An alternative approach to generalizing the two divergence measures is also given; it is based on expressions appearing as particular cases of the first generalization.
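To make the dependence on the relative information concrete, the following is a minimal numerical sketch in Python. The function names are my own, and the formulas used are the standard forms of these measures, which may differ in normalization from (2.7) and (2.9); it shows that both the Jeffreys-Kullback-Leibler divergence and the Jensen difference divergence can be computed from the relative information alone.

import numpy as np

def relative_information(p, q):
    """Relative information D(P||Q) = sum_i p_i * ln(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    """Jeffreys-Kullback-Leibler divergence: J(P||Q) = D(P||Q) + D(Q||P)."""
    return relative_information(p, q) + relative_information(q, p)

def jensen_difference(p, q):
    """Jensen difference divergence (information radius):
    I(P||Q) = (1/2) [ D(P||M) + D(Q||M) ], with M = (P + Q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * (relative_information(p, m) + relative_information(q, m))

if __name__ == "__main__":
    P = [0.5, 0.3, 0.2]
    Q = [0.2, 0.5, 0.3]
    print("D(P||Q) =", relative_information(P, Q))
    print("J(P||Q) =", j_divergence(P, Q))
    print("I(P||Q) =", jensen_difference(P, Q))

Note that the Jensen difference divergence computed this way remains bounded (by ln 2 in natural logarithms), while D and J grow without bound when some q_i tends to zero with p_i positive.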