
Entropy of Order r


The following characterization is due to Rényi (1961) [82]. Let us consider the following postulates for a function $ H(P)$ defined for $ P \in \delta_n$, where

$\displaystyle \delta_n=\Big\{P=(p_1,p_2,\ldots,p_n)\ \Big\vert\ p_i\geq 0,\ \sum_{i=1}^n{p_i}\leq 1\Big\}.$
    (3.15)
(i) $ H(P)$ is a symmetric function of the elements of $ P$.
(ii) If $ \{p\}$ denotes the generalized probability distribution consisting of the single probability $ p$, then $ H(\{p\})$ is a continuous function of $ p$ in the interval $ 0 < p \leq 1$.
(iii) $ H(\{{1\over 2}\}) =1$.
(iv) For $ P \in \delta_n$, $ Q \in \delta_m$ and $ P*Q \in \delta_{nm}$, where $ P*Q$ denotes the direct product distribution with probabilities $ p_iq_j$, we have 
$\displaystyle H(P*Q)=H(P)+H(Q).$
Before stating the last postulate, we introduce some notation. Let $ P=(p_1,p_2,\ldots,p_n) \in \delta_n$ and $ Q=(q_1,q_2,\ldots,q_m)\in \delta_m$ be two generalized probability distributions such that $ w(P)+w(Q)\leq 1$, where $ w(P) = \sum_{i=1}^n{p_i}\leq 1$ denotes the weight of $ P$ and $ P\cup Q=(p_1,\ldots,p_n,q_1,\ldots,q_m)$ denotes their union. The last postulate then requires
(v) $\displaystyle H(P\cup Q)=g^{-1}\Big({w(P)\,g(H(P))+w(Q)\,g(H(Q))\over w(P)+w(Q)}\Big),$
where $ g$ is a strictly monotonic function (a numerical illustration is sketched below).
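As an illustration of postulate (v), the following sketch (Python with NumPy) checks the equality numerically for the entropy of order $ r$ stated in the theorem below, taking base-2 logarithms as fixed by postulate (iii). The exponential choice $ g(x)=2^{(1-r)x}$, as well as the helper names H_r, w, g and g_inv, are illustrative assumptions for this sketch only; the postulate itself merely requires $ g$ to be strictly monotonic.

    import numpy as np

    r = 2.0                                   # an arbitrary order r > 0, r != 1 (assumed)
    g = lambda x: 2.0 ** ((1.0 - r) * x)      # assumed strictly monotonic g (exponential mean)
    g_inv = lambda y: np.log2(y) / (1.0 - r)  # its inverse

    def H_r(p):
        # Entropy of order r from the theorem below, with base-2 logarithms.
        p = np.asarray(p, dtype=float)
        return np.log2(np.sum(p ** r) / np.sum(p)) / (1.0 - r)

    def w(p):
        # Weight of a generalized distribution: w(P) = sum of its probabilities.
        return float(np.sum(p))

    P = np.array([0.30, 0.20])   # w(P) = 0.5
    Q = np.array([0.25, 0.15])   # w(Q) = 0.4, so w(P) + w(Q) <= 1

    lhs = H_r(np.concatenate([P, Q]))   # H(P U Q): union of the two distributions
    rhs = g_inv((w(P) * g(H_r(P)) + w(Q) * g(H_r(Q))) / (w(P) + w(Q)))
    assert np.isclose(lhs, rhs)         # postulate (v) holds for this choice of g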

Then $ H(P)$ is given by the entropy of order $ r$,

$\displaystyle H(P)=H_r(P)=(1-r)^{-1}\log\Big(\sum_{i=1}^n{p_i^r}\Big/\sum_{i=1}^n{p_i}\Big),\quad r\neq 1,\ r>0.$
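To make the formula concrete, here is a minimal numerical sketch (Python with NumPy; the function names renyi_entropy and direct_product are hypothetical). It uses base-2 logarithms, as implied by postulate (iii), checks the additivity postulate (iv) for the direct product $ P*Q$, and illustrates that letting $ r\to 1$ recovers the Shannon-type entropy $ \sum_{i=1}^n p_i\log(1/p_i)\big/\sum_{i=1}^n{p_i}$ of a generalized distribution.

    import numpy as np

    def renyi_entropy(p, r):
        # Entropy of order r for a generalized distribution p (base-2 logarithms).
        p = np.asarray(p, dtype=float)
        if abs(r - 1.0) < 1e-12:
            # Limiting case r -> 1: Shannon-type entropy of a generalized distribution.
            return -np.sum(p * np.log2(p)) / np.sum(p)
        return np.log2(np.sum(p ** r) / np.sum(p)) / (1.0 - r)

    def direct_product(p, q):
        # All products p_i * q_j, i.e. the distribution P*Q.
        return np.outer(p, q).ravel()

    P = np.array([0.3, 0.2, 0.1])           # generalized distribution, w(P) = 0.6
    Q = np.array([0.25, 0.25, 0.25, 0.25])  # ordinary (complete) distribution
    r = 2.5

    # Postulate (iii): a single probability 1/2 carries one unit of information.
    assert np.isclose(renyi_entropy([0.5], r), 1.0)

    # Postulate (iv): additivity over direct products.
    assert np.isclose(renyi_entropy(direct_product(P, Q), r),
                      renyi_entropy(P, r) + renyi_entropy(Q, r))

    # r -> 1 approaches the Shannon-type entropy.
    print(renyi_entropy(P, 1.0), renyi_entropy(P, 1.0 + 1e-8))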
Later, Daróczy (1963; 1964) [31], [32] reformulated the above axiomatic system. Motivated by the same considerations as Rényi, later researchers (Aczél and Daróczy, 1963 [1]; Varma, 1966 [119]; Kapur, 1967 [54]; and Rathie, 1970 [78]) generalized the entropy of order $ r$ to entropies involving more than one parameter. These generalizations are included in the list of entropies (see Section 3.5.2).
 
