Mutual Information
Let us consider the following differences:
$$I(X \wedge Y) = H(X) - H(X|Y),$$
$$I(X \wedge Y|Z) = H(X|Z) - H(X|YZ)$$
and
$$I(X \wedge YZ) = H(X) - H(X|YZ).$$
In view of Property 1.40, each of these differences is nonnegative, and the following property holds.
Property 1.47. We have
- (i) $I(X \wedge Y) \geq 0$, with equality iff $X$ and $Y$ are independent.
- (ii) $I(X \wedge Y|Z) \geq 0$, with equality iff $X$ and $Y$ are conditionally independent given $Z$.
- (iii) $I(X \wedge YZ) \geq 0$, with equality iff $X$ and $(Y,Z)$ are independent.
The above measures are known in the literature as follows:
- $I(X \wedge Y)$: the mutual information between the random variables $X$ and $Y$;
- $I(X \wedge Y|Z)$: the conditional mutual information between the random variables $X$ and $Y$ given $Z$;
- $I(X \wedge YZ)$: the mutual information between the random variables $X$ and $(Y,Z)$.
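For concreteness, here is a minimal Python sketch of the defining difference $I(X \wedge Y) = H(X) - H(X|Y)$. The $2 \times 2$ joint table `pxy` is a hypothetical example, not one taken from the text.

```python
# A minimal sketch in plain Python: evaluating I(X ^ Y) = H(X) - H(X|Y)
# for a hypothetical 2x2 joint distribution (the numbers are arbitrary).
from math import log2

def H(p):
    """Shannon entropy in bits of a probability vector; 0 log 0 = 0."""
    return -sum(q * log2(q) for q in p if q > 0)

pxy = [[0.30, 0.20],   # p(x, y): rows indexed by x, columns by y
       [0.10, 0.40]]
px = [sum(row) for row in pxy]         # marginal p(x)
py = [sum(col) for col in zip(*pxy)]   # marginal p(y)
# H(X|Y) = sum_y p(y) H(X | Y = y)
Hx_y = sum(py[y] * H([pxy[x][y] / py[y] for x in range(2)]) for y in range(2))
print(f"I(X ^ Y) = {H(px) - Hx_y:.4f} bits")   # about 0.1245, and >= 0
```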
We can write
$$I(X \wedge Y|Z) = \sum_{z} p(z)\, I(X \wedge Y|Z=z),$$
where
$$I(X \wedge Y|Z=z) = \sum_{x}\sum_{y} p(x,y|z)\, \log \frac{p(x,y|z)}{p(x|z)\, p(y|z)}$$
and $p(x,y|z) = p(x,y,z)/p(z)$, $p(x|z) = p(x,z)/p(z)$, $p(y|z) = p(y,z)/p(z)$. In view of the above representation, $I(X \wedge Y|Z)$ is sometimes understood as the average of the conditional mutual information of $X$ and $Y$ given $Z = z$.
The expression
$$\log \frac{p(x,y)}{p(x)\, p(y)}$$
is known as the per-letter information, and it may be negative. It is nonnegative iff $p(x,y) \geq p(x)\, p(y)$.
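The following short Python check, again on a hypothetical joint table, shows individual per-letter terms going negative even though their average, $I(X \wedge Y)$, is nonnegative by Property 1.47.

```python
# Per-letter information log( p(x,y) / (p(x) p(y)) ) term by term;
# the joint table is the same hypothetical example as above.
from math import log2

pxy = [[0.30, 0.20],
       [0.10, 0.40]]
px = [sum(r) for r in pxy]
py = [sum(c) for c in zip(*pxy)]

for x in range(2):
    for y in range(2):
        print(x, y, round(log2(pxy[x][y] / (px[x] * py[y])), 4))
# The (0, 1) and (1, 0) terms are negative, since p(x, y) < p(x) p(y) there.
```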
Similarly, we can write expressions for $I(X \wedge Y)$ and $I(X \wedge YZ)$.
In fact, the following properties hold.
Property 1.48. We have
- (i) $I(X \wedge Y) = H(X) + H(Y) - H(XY)$;
- (ii) $I(X \wedge Y|Z) = H(XZ) + H(YZ) - H(Z) - H(XYZ)$;
- (iii) $I(X \wedge YZ) = H(X) + H(YZ) - H(XYZ)$.
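A quick numerical sanity check of (i) on the hypothetical table used earlier: the defining difference $H(X) - H(X|Y)$, with $H(X|Y)$ computed directly from the conditionals, agrees with the symmetric form.

```python
# Checking H(X) - H(X|Y) == H(X) + H(Y) - H(XY) on a hypothetical table.
from math import log2

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

pxy = [[0.30, 0.20],
       [0.10, 0.40]]
px = [sum(r) for r in pxy]
py = [sum(c) for c in zip(*pxy)]
Hxy = H([q for r in pxy for q in r])
# H(X|Y) computed directly as sum_y p(y) H(X | Y = y)
Hx_y = sum(py[y] * H([pxy[x][y] / py[y] for x in range(2)]) for y in range(2))
print(abs((H(px) - Hx_y) - (H(px) + H(py) - Hxy)) < 1e-12)  # True
```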
Property 1.49. We have
- (i) $I(X \wedge Y) = I(Y \wedge X)$;
- (ii) $I(X \wedge YZ) = I(X \wedge Z) + I(X \wedge Y|Z)$;
- (iii) $I(X \wedge YZ) = I(X \wedge Y) + I(X \wedge Z|Y)$.
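A self-contained Python check of the chain-rule identities (ii) and (iii), using the expressions of Property 1.48; the three-variable joint table is a hypothetical example.

```python
# Verifying I(X ^ YZ) = I(X ^ Z) + I(X ^ Y|Z) = I(X ^ Y) + I(X ^ Z|Y).
from math import log2
from itertools import product

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

p = [[[0.10, 0.05], [0.05, 0.10]],   # hypothetical p(x, y, z), indexed [x][y][z]
     [[0.20, 0.10], [0.05, 0.35]]]

def marg(keep):
    """Marginal distribution over the coordinates named in `keep`."""
    out = {}
    for x, y, z in product(range(2), repeat=3):
        k = tuple(v for v, n in zip((x, y, z), 'xyz') if n in keep)
        out[k] = out.get(k, 0.0) + p[x][y][z]
    return list(out.values())

def I(a, b):
    """I(A ^ B) via H(A) + H(B) - H(AB), as in Property 1.48 (i)."""
    ab = ''.join(sorted(set(a + b), key='xyz'.index))
    return H(marg(a)) + H(marg(b)) - H(marg(ab))

def Icond(a, b, c):
    """I(A ^ B | C) via H(AC) + H(BC) - H(C) - H(ABC), as in 1.48 (ii)."""
    j = lambda *s: ''.join(sorted(set(''.join(s)), key='xyz'.index))
    return H(marg(j(a, c))) + H(marg(j(b, c))) - H(marg(c)) - H(marg(j(a, b, c)))

print(abs(I('x', 'yz') - (I('x', 'z') + Icond('x', 'y', 'z'))) < 1e-12)  # (ii)
print(abs(I('x', 'yz') - (I('x', 'y') + Icond('x', 'z', 'y'))) < 1e-12)  # (iii)
```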
Property 1.50. We have
- (i) $I(X \wedge Y)$ is a convex $\cap$ (i.e., concave) function of the probability distribution $p(x)$, for fixed conditional probabilities $p(y|x)$.
- (ii) $I(X \wedge Y)$ is a convex $\cup$ function of the conditional probability distribution $p(y|x)$, for fixed $p(x)$.
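A midpoint test of both statements, assuming the standard reading (concave in $p(x)$ for a fixed channel, convex in $p(y|x)$ for a fixed input); all distributions below are hypothetical.

```python
# Midpoint checks: f is concave if f(mid) >= average, convex if <=.
from math import log2

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

def mi(px, ch):
    """I(X ^ Y) for input distribution px and channel ch[x][y] = p(y|x)."""
    py = [sum(px[x] * ch[x][y] for x in range(2)) for y in range(2)]
    pxy = [px[x] * ch[x][y] for x in range(2) for y in range(2)]
    return H(px) + H(py) - H(pxy)

ch = [[0.9, 0.1], [0.2, 0.8]]
p1, p2 = [0.9, 0.1], [0.3, 0.7]
pm = [(a + b) / 2 for a, b in zip(p1, p2)]
print(mi(pm, ch) >= (mi(p1, ch) + mi(p2, ch)) / 2)     # True: concave in p(x)

ch1, ch2 = [[0.9, 0.1], [0.2, 0.8]], [[0.6, 0.4], [0.5, 0.5]]
chm = [[(a + b) / 2 for a, b in zip(r1, r2)] for r1, r2 in zip(ch1, ch2)]
px = [0.5, 0.5]
print(mi(px, chm) <= (mi(px, ch1) + mi(px, ch2)) / 2)  # True: convex in p(y|x)
```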
Property 1.51. We have
- (i) $I(X \wedge YZ) \geq I(X \wedge Y)$ (or resp. $I(X \wedge YZ) \geq I(X \wedge Z)$), with equality iff $X$ and $Z$ (or resp. $X$ and $Y$) are conditionally independent given $Y$ (or resp. given $Z$).
- (ii) $I(X \wedge YZ) \geq I(X \wedge Y|Z)$ (or resp. $I(X \wedge YZ) \geq I(X \wedge Z|Y)$), with equality iff $X$ and $Z$ (or resp. $X$ and $Y$) are independent.
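A quick check of (i) on the same style of hypothetical three-variable table: enlarging the second group of variables cannot decrease the mutual information.

```python
# Verifying I(X ^ YZ) >= I(X ^ Y) and I(X ^ YZ) >= I(X ^ Z).
from math import log2
from itertools import product

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

p = [[[0.10, 0.05], [0.05, 0.10]],   # hypothetical p(x, y, z), indexed [x][y][z]
     [[0.20, 0.10], [0.05, 0.35]]]

def marg(keep):
    out = {}
    for x, y, z in product(range(2), repeat=3):
        k = tuple(v for v, n in zip((x, y, z), 'xyz') if n in keep)
        out[k] = out.get(k, 0.0) + p[x][y][z]
    return list(out.values())

def I(a, b):
    ab = ''.join(sorted(set(a + b), key='xyz'.index))
    return H(marg(a)) + H(marg(b)) - H(marg(ab))

print(I('x', 'yz') >= I('x', 'y'))   # True
print(I('x', 'yz') >= I('x', 'z'))   # True
```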
Property 1.52. If $X$ and $Y$ are conditionally independent given $Z$, then
- (i) $I(X \wedge Y) \leq I(X \wedge Z)$;
- (ii) $I(X \wedge Y) \leq I(Z \wedge Y)$.
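These are the data-processing inequalities. The sketch below builds a hypothetical Markov chain $X \to Z \to Y$, so that $X$ and $Y$ are conditionally independent given $Z$, and checks both bounds.

```python
# Data processing on p(x, z, y) = p(x) p(z|x) p(y|z); numbers are hypothetical.
from math import log2
from itertools import product

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

px   = [0.6, 0.4]
pz_x = [[0.7, 0.3], [0.1, 0.9]]    # p(z|x)
py_z = [[0.8, 0.2], [0.25, 0.75]]  # p(y|z)
joint = {(x, z, y): px[x] * pz_x[x][z] * py_z[z][y]
         for x, z, y in product(range(2), repeat=3)}

def marg(idx):
    """Marginal over the coordinate positions in idx (0 = X, 1 = Z, 2 = Y)."""
    out = {}
    for k, v in joint.items():
        kk = tuple(k[i] for i in idx)
        out[kk] = out.get(kk, 0.0) + v
    return list(out.values())

def I(a, b):
    return H(marg(a)) + H(marg(b)) - H(marg(tuple(sorted(a + b))))

print(I((0,), (2,)) <= I((0,), (1,)))  # (i):  I(X ^ Y) <= I(X ^ Z): True
print(I((0,), (2,)) <= I((1,), (2,)))  # (ii): I(X ^ Y) <= I(Z ^ Y): True
```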
Property 1.53. We have
- (i) $I(X \wedge Y) \leq \min\{H(X), H(Y)\}$;
- (ii) $I(X \wedge Y|Z) \leq \min\{H(X|Z), H(Y|Z)\}$;
- (iii) $I(X \wedge YZ) \leq H(X)$;
- (iv) $I(X \wedge X) = H(X)$.
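A final check of the bounds (i) and (iv) as stated above, on the hypothetical $2 \times 2$ table used throughout.

```python
# I(X ^ Y) <= min{H(X), H(Y)}, and I(X ^ X) = H(X).
from math import log2

def H(p):
    return -sum(q * log2(q) for q in p if q > 0)

pxy = [[0.30, 0.20],
       [0.10, 0.40]]
px = [sum(r) for r in pxy]
py = [sum(c) for c in zip(*pxy)]
I = H(px) + H(py) - H([q for r in pxy for q in r])
print(I <= min(H(px), H(py)))     # (i): True

Hxx = H(px)   # joint entropy of (X, X): the pair is supported on the diagonal
Ixx = H(px) + H(px) - Hxx
print(abs(Ixx - H(px)) < 1e-12)   # (iv): I(X ^ X) = H(X): True
```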