Rényi (1961) [82] first presented a scalar parametric generalization of Kullback-Leibler's relative information as

$$
D_r(P\|Q) = (r-1)^{-1}\,\ln\Big(\sum_{i=1}^n p_i^{\,r} q_i^{\,1-r}\Big), \qquad r \neq 1,\ r > 0.
$$

Rathie and Kannappan (1972) [79] presented an alternative way to generalize Kullback-Leibler's information as

$$
D^s(P\|Q) = (s-1)^{-1}\Big[\sum_{i=1}^n p_i^{\,s} q_i^{\,1-s} - 1\Big], \qquad s \neq 1.
$$

Sharma and Mittal (1977) [91] studied one and two scalar parametric generalizations of $D(P\|Q)$ as

$$
D_1^s(P\|Q) = (s-1)^{-1}\big[e^{(s-1)D(P\|Q)} - 1\big], \qquad s \neq 1,
$$

and

$$
D_r^s(P\|Q) = (s-1)^{-1}\Big\{\Big[\sum_{i=1}^n p_i^{\,r} q_i^{\,1-r}\Big]^{(s-1)/(r-1)} - 1\Big\}, \qquad r \neq 1,\ s \neq 1,\ r > 0.
$$
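As a quick numerical sanity check (a sketch, not part of the text; the function names are ours, and natural logarithms are used to match the multiplicative constant $(s-1)^{-1}$), each of the generalizations above recovers Kullback-Leibler's $D(P\|Q)$ as its parameters tend to 1:

```python
import math

# Hypothetical helper names (not from the text); all measures in nats.

def kl(p, q):
    """Kullback-Leibler relative information D(P||Q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def renyi(p, q, r):
    """Renyi's relative information of order r (r > 0, r != 1)."""
    return math.log(sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))) / (r - 1)

def type_s(p, q, s):
    """Relative information of type s (s != 1), constant 1/(s-1)."""
    return (sum(pi**s * qi**(1 - s) for pi, qi in zip(p, q)) - 1) / (s - 1)

def sharma_mittal(p, q, r, s):
    """Two-parameter measure D_r^s (r > 0, r != 1, s != 1)."""
    t = sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))
    return (t ** ((s - 1) / (r - 1)) - 1) / (s - 1)

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]
eps = 1e-7

# Each generalization approaches D(P||Q) as r, s -> 1.
print(kl(P, Q))
print(renyi(P, Q, 1 + eps))
print(type_s(P, Q, 1 + eps))
print(sharma_mittal(P, Q, 1 + eps, 1 + eps))
```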
Instead of studying the measures $D_r(P\|Q)$, $D^s(P\|Q)$, $D_1^s(P\|Q)$, and $D_r^s(P\|Q)$ separately, one can study them jointly. Let us write these measures in a unified expression, as in the case of the entropy-type measures, in the following way:

$$
\mathcal{D}_r^s(P\|Q) =
\begin{cases}
D_r^s(P\|Q), & r \neq 1,\ s \neq 1,\\
D_r(P\|Q), & r \neq 1,\ s = 1,\\
D_1^s(P\|Q), & r = 1,\ s \neq 1,\\
D(P\|Q), & r = 1,\ s = 1,
\end{cases}
\tag{4.1}
$$

for all $r > 0$, $s \in \mathbb{R}$ and $P, Q \in \Delta_n$. The measure $D^s(P\|Q)$ does not appear in the unified expression because it is already contained in $D_r^s(P\|Q)$ as a particular case when $r = s$.
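This containment can be checked numerically: setting $r = s$ makes the outer exponent $(s-1)/(r-1)$ equal to 1, so the two-parameter measure collapses to the type-$s$ measure (a minimal sketch; the function names are ours):

```python
import math

def type_s(p, q, s):
    """Relative information of type s (s != 1), constant 1/(s-1)."""
    return (sum(pi**s * qi**(1 - s) for pi, qi in zip(p, q)) - 1) / (s - 1)

def sharma_mittal(p, q, r, s):
    """Two-parameter measure D_r^s (r > 0, r != 1, s != 1)."""
    t = sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))
    return (t ** ((s - 1) / (r - 1)) - 1) / (s - 1)

P = [0.4, 0.4, 0.2]
Q = [0.25, 0.5, 0.25]

# With r = s the exponent (s-1)/(r-1) is 1, so D_s^s(P||Q) = D^s(P||Q).
for s in (0.5, 2.0, 3.0):
    print(s, sharma_mittal(P, Q, s, s), type_s(P, Q, s))
```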
We call the measure $\mathcal{D}_r^s(P\|Q)$ the unified $(r,s)$-relative information.
Note 4.1. The definitions of $D_1^s(P\|Q)$ and $D_r^s(P\|Q)$ given initially by Sharma and Mittal (1977) [91] involve the condition $s > 0$. But here, in our study, we have relaxed this condition. The constant initially considered was $(2^{s-1}-1)^{-1}$.
Also, we have changed the multiplicative constant and took it as $(s-1)^{-1}$ in order to simplify our study of generalized divergence measures. Some study of the measures $D_r(P\|Q)$, $D^s(P\|Q)$, and $D_r^s(P\|Q)$ can be seen in Mathai and Rathie (1975) [71].
In particular, we have

$$
\mathcal{D}_r^s(P\|U) = n^{s-1}\big[\mathcal{H}_r^s(U) - \mathcal{H}_r^s(P)\big]
\tag{4.2}
$$

for all $P \in \Delta_n$, $r > 0$, where $U = (1/n, \ldots, 1/n)$ is the uniform distribution and $\mathcal{H}_r^s(P)$ is the unified $(r,s)$-entropy given in Chapter 3, expression (3.8).
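Identity (4.2) can be verified numerically under the conventions used here, i.e. multiplicative constants $(s-1)^{-1}$ for the divergence and $(1-s)^{-1}$ for the entropy (a sketch with our own function names):

```python
import math

def sharma_mittal(p, q, r, s):
    """D_r^s(P||Q) with constant 1/(s-1)."""
    t = sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))
    return (t ** ((s - 1) / (r - 1)) - 1) / (s - 1)

def entropy_rs(p, r, s):
    """H_r^s(P) with constant 1/(1-s), as assumed from Chapter 3."""
    t = sum(pi**r for pi in p)
    return (t ** ((1 - s) / (1 - r)) - 1) / (1 - s)

n = 4
P = [0.4, 0.3, 0.2, 0.1]
U = [1 / n] * n  # uniform distribution

# D_r^s(P||U) = n^(s-1) * [H_r^s(U) - H_r^s(P)] for several (r, s).
for r, s in [(2.0, 0.5), (0.5, 3.0), (3.0, 2.0)]:
    lhs = sharma_mittal(P, U, r, s)
    rhs = n ** (s - 1) * (entropy_rs(U, r, s) - entropy_rs(P, r, s))
    print(r, s, lhs, rhs)
```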
As in the case of generalized entropies, here also we can write

$$
\mathcal{D}_r^s(P\|Q) = \xi_s\big(D_r(P\|Q)\big),
\tag{4.3}
$$

where $\xi_s : [0, \infty) \to \mathbb{R}$ (reals) is given by

$$
\xi_s(x) =
\begin{cases}
(s-1)^{-1}\big[e^{(s-1)x} - 1\big], & s \neq 1,\\
x, & s = 1.
\end{cases}
\tag{4.4}
$$
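The representation (4.3)–(4.4) can also be checked numerically: composing $\xi_s$ with Rényi's measure (natural logarithm) reproduces the two-parameter measure $D_r^s$ (a sketch; the function names are ours):

```python
import math

def renyi(p, q, r):
    """Renyi's relative information of order r (natural log)."""
    return math.log(sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))) / (r - 1)

def sharma_mittal(p, q, r, s):
    """D_r^s(P||Q) with constant 1/(s-1)."""
    t = sum(pi**r * qi**(1 - r) for pi, qi in zip(p, q))
    return (t ** ((s - 1) / (r - 1)) - 1) / (s - 1)

def xi(x, s):
    """The function of (4.4): xi_s(x) = [e^{(s-1)x} - 1]/(s-1), xi_1(x) = x."""
    return x if s == 1 else (math.exp((s - 1) * x) - 1) / (s - 1)

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]

# xi_s(D_r(P||Q)) equals D_r^s(P||Q) for r != 1, s != 1.
for r, s in [(2.0, 0.5), (0.5, 2.0), (3.0, 3.0)]:
    print(r, s, xi(renyi(P, Q, r), s), sharma_mittal(P, Q, r, s))
```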
Proposition 4.1. The function $\xi_s(x)$ given by (4.4) has the following properties.