Mutual Information
Let us consider the following differences:

$$I(X \wedge Y) = H(X) - H(X\vert Y),$$

$$I(X \wedge Y\vert Z) = H(X\vert Z) - H(X\vert Y,Z)$$

and

$$I\big((X,Y) \wedge Z\big) = H(Z) - H(Z\vert X,Y).$$
In view of Property 1.40, the following property holds.
Property 1.47. We have
- (i) $I(X \wedge Y) \geq 0$, with equality iff $X$ and $Y$ are independent.

- (ii) $I(X \wedge Y\vert Z) \geq 0$, with equality iff $X$ and $Y$ are conditionally independent given $Z$.

- (iii) $I\big((X,Y) \wedge Z\big) \geq 0$, with equality iff $(X,Y)$ and $Z$ are independent.
The above measures are well known in the literature as follows:

- $I(X \wedge Y)$: the mutual information between the random variables $X$ and $Y$;

- $I(X \wedge Y\vert Z)$: the mutual information between the random variables $X$ and $Y$ given $Z$;

- $I\big((X,Y) \wedge Z\big)$: the mutual information between the random variables $(X,Y)$ and $Z$.
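These quantities are easy to evaluate numerically. The following minimal sketch (added here as an illustration, not part of the original text) computes $I(X\wedge Y)$, $I(X\wedge Y\vert Z)$ and $I\big((X,Y)\wedge Z\big)$ as the entropy differences above for a small joint table and checks the nonnegativity asserted in Property 1.47; the array `pxyz` and the helper `H` are illustrative names only.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; zero-probability entries contribute nothing."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Illustrative joint distribution p(x, y, z) on a 2 x 2 x 2 alphabet.
pxyz = np.array([[[0.10, 0.05], [0.05, 0.20]],
                 [[0.15, 0.05], [0.10, 0.30]]])
assert np.isclose(pxyz.sum(), 1.0)

pxy, pxz, pyz = pxyz.sum(axis=2), pxyz.sum(axis=1), pxyz.sum(axis=0)
px, py, pz = pxy.sum(axis=1), pxy.sum(axis=0), pxz.sum(axis=0)

# I(X ^ Y) = H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y)
I_XY = H(px) - (H(pxy) - H(py))
# I(X ^ Y | Z) = H(X|Z) - H(X|Y,Z)
I_XY_given_Z = (H(pxz) - H(pz)) - (H(pxyz) - H(pyz))
# I((X,Y) ^ Z) = H(Z) - H(Z|X,Y)
I_XYandZ = H(pz) - (H(pxyz) - H(pxy))

print(I_XY, I_XY_given_Z, I_XYandZ)
assert min(I_XY, I_XY_given_Z, I_XYandZ) >= -1e-12   # Property 1.47
```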
We can write

$$I(X \wedge Y) = \sum_{i=1}^n p(x_i)\, I(X=x_i \wedge Y) = \sum_{j=1}^m p(y_j)\, I(X \wedge Y=y_j),$$

where

$$I(X=x_i \wedge Y) = \sum_{j=1}^m p(y_j\vert x_i) \log \frac{p(y_j\vert x_i)}{p(y_j)}$$

and

$$I(X \wedge Y=y_j) = \sum_{i=1}^n p(x_i\vert y_j) \log \frac{p(x_i\vert y_j)}{p(x_i)}.$$

In view of the above representation, $I(X \wedge Y)$ is sometimes understood as the average of the conditional mutual information of $Y$ given $X=x_i$, or of $X$ given $Y=y_j$.

The expression

$$I(x_i \wedge y_j) = \log \frac{p(x_i,y_j)}{p(x_i)\, p(y_j)}$$

is known as the per-letter information, and it may be negative. It is nonnegative iff $p(x_i,y_j) \geq p(x_i)\, p(y_j)$.
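As a concrete illustration (added here; not from the original text), the sketch below evaluates the per-letter information cell by cell for a small joint table: individual entries can be negative, while their average under $p(x_i,y_j)$, which equals $I(X\wedge Y)$, is nonnegative. The table `pxy` is an arbitrary illustrative choice.

```python
import numpy as np

# Illustrative joint distribution p(x, y) on a 2 x 2 alphabet.
pxy = np.array([[0.40, 0.10],
                [0.25, 0.25]])
px = pxy.sum(axis=1, keepdims=True)   # p(x) as a column vector
py = pxy.sum(axis=0, keepdims=True)   # p(y) as a row vector

# Per-letter information log p(x,y) / (p(x) p(y)), in bits.
per_letter = np.log2(pxy / (px * py))
print(per_letter)                     # some entries are negative

# Its average under p(x, y) is the mutual information, which is nonnegative.
I_XY = float(np.sum(pxy * per_letter))
print(I_XY)
assert I_XY >= 0
```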
Similarly, we can write expressions for $I(X \wedge Y\vert Z)$ and $I\big((X,Y) \wedge Z\big)$.
In fact, the following properties hold.
Property 1.48. We have
- (i) $E_X \big[ I(X=x_i\wedge Y)\big] = E_Y\big[ I(X\wedge Y=y_j)\big]$.

- (ii) $E_{YZ} \big[ I(X\wedge Y=y_j\vert Z=z_k)\big] = E_{XZ}\big[I(X=x_i\wedge Y\vert Z=z_k)\big]$.

- (iii) $E_Z \big[ I\big((X,Y)\wedge Z=z_k\big)\big] = E_{XY}\big[ I\big((X=x_i,Y=y_j)\wedge Z\big) \big]$.
Property 1.49. We have
- (i) $I(X \wedge Y) = H(X) + H(Y) - H(X,Y)$.

- (ii) $I(X \wedge Y\vert Z) = H(X\vert Z) + H(Y\vert Z) - H(X,Y\vert Z)$.

- (iii) $I\big((X,Y) \wedge Z\big) = H(X,Y) + H(Z) - H(X,Y,Z)$.
Property 1.50. We have
- (i) $I(X \wedge Y)$ is a convex $\cap$ (concave) function of the probability distribution $\{p(x_i)\}$, for fixed conditional probabilities $\{p(y_j\vert x_i)\}$.

- (ii) $I(X \wedge Y)$ is a convex $\cup$ function of the conditional probability distribution $\{p(y_j\vert x_i)\}$, for fixed $\{p(x_i)\}$.
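The following is a minimal numerical check of these two statements (an illustration added here, not part of the original text): for a fixed channel $\{p(y_j\vert x_i)\}$ the defining inequality of concavity is tested at a single mixing point $\lambda = 1/2$, and convexity in the channel is tested with the input distribution held fixed. The names `mutual_info`, `p0`, `p1`, `W0`, `W1` are illustrative.

```python
import numpy as np

def mutual_info(px, W):
    """I(X ^ Y) in bits for input distribution px and channel W[i, j] = p(y_j | x_i)."""
    pxy = px[:, None] * W                     # joint p(x, y)
    py = pxy.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(pxy > 0, pxy * np.log2(pxy / (px[:, None] * py)), 0.0)
    return float(terms.sum())

# Two illustrative input distributions and two illustrative channels (rows sum to 1).
p0, p1 = np.array([0.9, 0.1]), np.array([0.2, 0.8])
W0 = np.array([[0.8, 0.2], [0.3, 0.7]])
W1 = np.array([[0.6, 0.4], [0.1, 0.9]])
lam = 0.5

# (i) Concave in the input distribution, channel W0 fixed.
lhs = mutual_info(lam * p0 + (1 - lam) * p1, W0)
rhs = lam * mutual_info(p0, W0) + (1 - lam) * mutual_info(p1, W0)
assert lhs >= rhs - 1e-12

# (ii) Convex in the channel, input distribution p0 fixed.
lhs = mutual_info(p0, lam * W0 + (1 - lam) * W1)
rhs = lam * mutual_info(p0, W0) + (1 - lam) * mutual_info(p0, W1)
assert lhs <= rhs + 1e-12
```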
Property 1.51. We have
- (i) $I\big(X \wedge (Y,Z)\big) \geq I(X \wedge Y)$ and $I\big((X,Y) \wedge Z\big) \geq I(Y \wedge Z)$, with equality iff $X$ (or resp. $Z$) and $Z$ (resp. $X$) are conditionally independent given $Y$.

- (ii) $I\big(X \wedge (Y,Z)\big) \geq I(X \wedge Z\vert Y)$ and $I\big((X,Y) \wedge Z\big) \geq I(X \wedge Z\vert Y)$, with equality iff $X$ (or resp. $Z$) and $Y$ are independent.
Property 1.52. If $X$ and $Z$ are conditionally independent given $Y$, then

- (i) $I(X \wedge Z) \leq I(X \wedge Y)$.

- (ii) $I(X \wedge Z) \leq I(Y \wedge Z)$.
Property 1.53. We have
- (i) $I\big(X \wedge (Y,Z)\big) = I(X \wedge Y) + I(X \wedge Z\vert Y)$.

- (ii) $I\big(X \wedge (Y,Z)\big) = I(X \wedge Z) + I(X \wedge Y\vert Z)$.

- (iii) $I\big((X,Y) \wedge Z\big) = I(Y \wedge Z) + I(X \wedge Z\vert Y)$.

- (iv) $I\big((X,Y) \wedge Z\big) = I(X \wedge Z) + I(Y \wedge Z\vert X)$.