
M-Dimensional Divergence Measures


Let $ P_1,P_2,...,P_M \in \Delta_n$ be $ M$ probability distributions, and let $ (\lambda_1,\lambda_2,...,\lambda_M) \in \Delta_M$. It is well known that Shannon's entropy satisfies the following inequality:

$\displaystyle \sum_{j=1}^M{\lambda_j \ H(P_j)} \leq H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big),$
    (5.1)

with equality iff $ P_1=P_2=...=P_M$.
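Inequality (5.1) is the concavity of Shannon's entropy, and it can be verified numerically. The following sketch is not part of the original text; it uses NumPy with illustrative variable names (`P` for the distributions, `lam` for the weights):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p)))

rng = np.random.default_rng(0)
M, n = 4, 6
# M random distributions on n points, and a random weight vector in Delta_M
P = rng.dirichlet(np.ones(n), size=M)   # rows are P_1,...,P_M
lam = rng.dirichlet(np.ones(M))         # (lambda_1,...,lambda_M)

lhs = sum(lam[j] * shannon_entropy(P[j]) for j in range(M))
rhs = shannon_entropy(lam @ P)          # H(sum_j lambda_j P_j)
assert lhs <= rhs + 1e-12               # concavity: inequality (5.1)
```

Equality would require all rows of `P` to coincide, which random Dirichlet draws almost surely avoid.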

Also, by Property 1.14, we have

$\displaystyle H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big) \leq H\Big(\sum_{j=1}^M{\lambda_jP_j}\vert\vert P_k\Big),$
    (5.2)

for each k=1,2,...,M with equality iff $ P_1=P_2=...=P_M$.

We can easily check that

$\displaystyle H\Big(\sum_{j=1}^M{\lambda_jP_j}\vert\vert P_k\Big) =\sum_{j=1}^M{\lambda_j \ H(P_j\vert\vert P_k)},\ \forall\, k=1,2,...,M$
    (5.3)
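Identity (5.3) holds because the inaccuracy $H(P\vert\vert Q)$ is linear in its first argument. A quick numerical check (not from the original text; the form $H(P\vert\vert Q)=-\sum_i p_i\log q_i$ is assumed, and the variable names are illustrative):

```python
import numpy as np

def inaccuracy(p, q):
    """Kerridge inaccuracy H(P||Q) = -sum_i p_i log q_i (assumed form)."""
    return -np.sum(np.asarray(p, dtype=float) * np.log(q))

rng = np.random.default_rng(3)
M, n = 3, 4
P = rng.dirichlet(np.ones(n), size=M)   # rows are P_1,...,P_M
lam = rng.dirichlet(np.ones(M))         # weights in Delta_M

k = 0  # any fixed k = 1,...,M works the same way
lhs = inaccuracy(lam @ P, P[k])                                 # H(sum_j lambda_j P_j || P_k)
rhs = sum(lam[j] * inaccuracy(P[j], P[k]) for j in range(M))    # sum_j lambda_j H(P_j || P_k)
assert np.isclose(lhs, rhs)   # (5.3): linearity in the first argument
```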

Multiplying both sides of (5.2) by $ \lambda_k$, summing over k=1,2,...,M and using (5.3), we have

$\displaystyle H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big)\leq\sum_{j=1}^M{\sum_{k=1}^M{\lambda_j\lambda_k\,H\Big(P_j\vert\vert P_k\Big)}},$
    (5.4)

with equality iff $ P_1=P_2=...=P_M$.

From (5.1) and (5.4), we have

$\displaystyle \sum_{j=1}^M{\lambda_j \ H(P_j)} \leq H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big) \leq \sum_{j=1}^M{\sum_{k=1}^M{\lambda_j\lambda_k\,H\Big(P_j\vert\vert P_k\Big)}},$
    (5.5)

with equality iff $ P_1=P_2=...=P_M$.

The above inequality (5.5) gives rise to the following three nonnegative differences:

$\displaystyle J(P_1,P_2,...,P_M)=\sum_{j=1}^M{\sum_{k=1}^M{\lambda_j\lambda_k\,H\Big( P_j\vert\vert P_k\Big)}}-\sum_{j=1}^M{\lambda_j\ H(P_j)}$
$\displaystyle = {\sum_{j,k=1}^M}_{j\neq k}{\lambda_j\lambda_k\,D\Big(P_j\vert\vert P_k\Big)} = \sum_{i=1}^n{\sum_{j=1}^M{\lambda_j\,p_{ji}\log \Biggl({p_{ji}\over \prod_{k=1}^M{ {p_{ki}}^{\lambda_k}}}\Biggr)}},$
    (5.6)

$\displaystyle I(P_1,P_2,...,P_M)=H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big)-\sum_{j=1}^M{\lambda_jH(P_j)}$
$\displaystyle = \sum_{j=1}^M{\lambda_j\,D\Big(P_j\Big\vert\Big\vert\sum_{k=1}^M{\lambda_kP_k}\Big)} = \sum_{i=1}^n{\sum_{j=1}^M{\lambda_j p_{ji}\log\Biggl({p_{ji}\over\sum_{k=1}^M{ \lambda_k p_{ki} } }\Biggr) }},$
    (5.7)

and 

$\displaystyle T(P_1,P_2,...,P_M)=\sum_{j=1}^M{\sum_{k=1}^M{\lambda_j\lambda_k\ H\Big(P_j\vert\vert P_k\Big)}}-H\Big(\sum_{j=1}^M{\lambda_jP_j}\Big) $
$\displaystyle = \sum_{i=1}^n{\sum_{j=1}^M{\lambda_jp_{ji}\log\Biggl({\sum_{k=1}^M{\lambda_kp_{ki}}\over\prod_{k=1}^M{ {p_{ki}}^{\lambda_k} } }\Biggr) }}.$
    (5.8)

From (5.6), (5.7) and (5.8), we conclude that

$\displaystyle I(P_1,P_2,...,P_M)+T(P_1,P_2,...,P_M)=J(P_1,P_2,...,P_M).$
    (5.9)
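The identity (5.9) and the nonnegativity of all three measures can be confirmed numerically. The sketch below is not from the original text; it evaluates $J$, $I$, and $T$ directly from their definitions in (5.6)-(5.8), with `mix` denoting $\sum_j \lambda_j P_j$ and the inaccuracy $H(P_j\vert\vert P_k)$ taken as $-\sum_i p_{ji}\log p_{ki}$ (an assumed form):

```python
import numpy as np

def H(p):
    """Shannon entropy with natural log (all entries assumed positive)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def D(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_i p_i log(p_i/q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(1)
M, n = 3, 5
P = rng.dirichlet(np.ones(n), size=M)   # rows are P_1,...,P_M
lam = rng.dirichlet(np.ones(M))         # weights in Delta_M
mix = lam @ P                           # sum_j lambda_j P_j

# (5.6): J as the double-sum KL form (diagonal terms vanish since D(P||P)=0)
J = sum(lam[j] * lam[k] * D(P[j], P[k]) for j in range(M) for k in range(M))
# (5.7): I = entropy of the mixture minus the mean of the entropies
I = H(mix) - sum(lam[j] * H(P[j]) for j in range(M))
# (5.8): T = weighted inaccuracy double sum minus entropy of the mixture
T = sum(lam[j] * lam[k] * (-np.sum(P[j] * np.log(P[k])))
        for j in range(M) for k in range(M)) - H(mix)

assert np.isclose(I + T, J)               # identity (5.9)
assert I >= -1e-12 and T >= -1e-12        # both differences are nonnegative
```

Since $I$ and $T$ are both nonnegative, (5.10) and (5.11) follow immediately from (5.9).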

From (5.9), we have the following inequalities:

$\displaystyle J(P_1,P_2,...,P_M)\geq I(P_1,P_2,...,P_M),$
    (5.10)

and

$\displaystyle J(P_1,P_2,...,P_M)\geq T(P_1,P_2,...,P_M).$
    (5.11)

When $ M=2$, $ P_1=P$ and $ P_2=Q$, the measures (5.6) and (5.7) in particular reduce to (2.6) and (2.7) respectively, up to a multiplicative constant.

Note 5.1.

(i) The measure (5.6) is an $ M$-dimensional generalization of $ J$-divergence and was first studied by Toussaint (1978) [110]. The measure (5.7) is an $ M$-dimensional generalization of $ I$-divergence (2.8) and was first studied by Burbea and Rao (1982a) [18]. The measure (5.8) is new. It involves the expressions appearing in the weighted arithmetic and geometric mean inequality given by
$\displaystyle \sum_{k=1}^M{\lambda_kp_{ki}}\geq \prod_{k=1}^M{{p_{ki}}^{\lambda_k}}.$
(ii) We call the measure (5.8) the arithmetic and geometric mean divergence measure, or simply the $ T$-divergence. It was studied by Taneja (1995) [108].
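The weighted arithmetic-geometric mean inequality behind the $T$-divergence can also be checked directly. A minimal sketch, not from the original text, with illustrative names (`x` stands for a column $p_{1i},...,p_{Mi}$ of positive probabilities, `lam` for the weights):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.random(5) + 0.1           # positive values, playing the role of p_{ki}
lam = rng.dirichlet(np.ones(5))   # weights lambda_k summing to 1

am = np.sum(lam * x)              # weighted arithmetic mean
gm = np.prod(x ** lam)            # weighted geometric mean
assert am >= gm                   # the inequality used in Note 5.1 (i)
```

Applied coordinatewise with $x_k = p_{ki}$, this shows every logarithm in (5.8) is nonnegative, so $T \geq 0$.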

21-06-2001
Inder Jeet Taneja
Departamento de Matemática - UFSC
88.040-900 Florianópolis, SC - Brazil