In this section we extend the notion of generalized entropies to
the continuous case, and examine some properties of the resulting unified entropy
function.
Let X be an absolutely continuous random variable, that is, a random
variable having a probability density function p(x). The unified entropy
of X is defined as follows:
$$
\mathscr{E}_r^s(X) = \frac{1}{1-s}\left[\left(\int_{-\infty}^{\infty} p(x)^r\,dx\right)^{\frac{1-s}{1-r}} - 1\right] \qquad (3.14)
$$

provided the integrals exist, where $r > 0$, $r \neq 1$, $s \neq 1$.
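The definition can be checked numerically. The sketch below assumes the two-parameter case of (3.14) with $r \neq 1$ and $s \neq 1$, and approximates the integral by a midpoint Riemann sum; the helper name `unified_entropy` is ours, not from the text.

```python
def unified_entropy(p, a, b, r, s, n=200_000):
    """Unified (r, s)-entropy of a density p supported on [a, b],
    approximated with a midpoint Riemann sum (assumes r != 1, s != 1)."""
    h = (b - a) / n
    integral = sum(p(a + (k + 0.5) * h) ** r for k in range(n)) * h
    return (integral ** ((1.0 - s) / (1.0 - r)) - 1.0) / (1.0 - s)

# Uniform density on [0, c]: the integral of p^r is c**(1 - r), so the
# entropy has the closed form (c**(1 - s) - 1) / (1 - s).
c, r, s = 2.0, 0.5, 2.0
est = unified_entropy(lambda x: 1.0 / c, 0.0, c, r, s)
exact = (c ** (1.0 - s) - 1.0) / (1.0 - s)

# Unlike a discrete entropy, the continuous value can be negative:
# a uniform density on [0, 0.5] gives a value near -1.
est_neg = unified_entropy(lambda x: 2.0, 0.0, 0.5, r, s)
print(est, exact, est_neg)
```

For a uniform density the Riemann sum is exact up to rounding, so the estimate matches the closed form to many decimals.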
The contrast between continuous and discrete distributions is worth emphasising:
Example 3.2. Let $X$ be a discrete random variable taking the values $x_1, \dots, x_n$ with equal probabilities $1/n$. Then

$$
\mathscr{E}_r^s(X) = \frac{1}{1-s}\left[n^{1-s} - 1\right],
$$

which is nonnegative and depends only on $n$, not on the values $x_i$.
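Under the same two-parameter reading of the unified entropy (our assumption), the computation in Example 3.2 is short: with equal probabilities $1/n$, $\sum_i p_i^r = n^{1-r}$, and the exponent $(1-s)/(1-r)$ turns this into $n^{1-s}$.

```python
def unified_entropy_discrete(probs, r, s):
    """Discrete unified (r, s)-entropy, assuming r != 1 and s != 1."""
    total = sum(p ** r for p in probs)
    return (total ** ((1.0 - s) / (1.0 - r)) - 1.0) / (1.0 - s)

n, r, s = 8, 0.5, 2.0
value = unified_entropy_discrete([1.0 / n] * n, r, s)
closed_form = (n ** (1.0 - s) - 1.0) / (1.0 - s)
# The result depends only on n, never on which points x_i carry the
# mass, and it is nonnegative for every n >= 1.
print(value, closed_form)
```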
Example 3.3. We consider a function $Y = g(X)$, where $g$ is a strictly increasing function of $X$. Since the mapping from $X$ to $Y$ is one to one, we have

$$
\mathscr{E}_r^s(Y) = \mathscr{E}_r^s(X)
$$

in the discrete case: the transformation merely relabels the support points and leaves the probabilities unchanged. For an absolutely continuous $X$, by contrast, the density of $Y$ is $p(g^{-1}(y))/g'(g^{-1}(y))$, so the unified entropy is in general not invariant.
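The contrast in Example 3.3 can be made concrete. In the discrete case a strictly increasing $g$ only relabels the support, so the probability list, and with it the entropy, is untouched; for a continuous $X$ even the simple map $g(x) = 2x$ changes the value. The sketch below uses the closed form $(c^{1-s}-1)/(1-s)$ for a uniform density on $[0, c]$; both helper names are ours.

```python
def unified_entropy_discrete(probs, r, s):
    """Discrete unified (r, s)-entropy, assuming r != 1 and s != 1."""
    total = sum(p ** r for p in probs)
    return (total ** ((1.0 - s) / (1.0 - r)) - 1.0) / (1.0 - s)

def uniform_entropy(c, s):
    """Unified entropy of a uniform density on [0, c] (closed form)."""
    return (c ** (1.0 - s) - 1.0) / (1.0 - s)

r, s = 0.5, 2.0
probs = [0.1, 0.2, 0.3, 0.4]
# Discrete case: a strictly increasing g merely relabels the support
# points; the probability list is identical, so the entropy is too.
before = unified_entropy_discrete(probs, r, s)
after = unified_entropy_discrete(probs, r, s)  # same probabilities, new labels

# Continuous case: Y = 2X maps the uniform density on [0, 1] to the
# uniform density on [0, 2], and the entropy moves from 0 to 0.5.
print(before == after, uniform_entropy(1.0, s), uniform_entropy(2.0, s))
```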
These important differences between the discrete and continuous cases are a warning that results for discrete distributions cannot be carried over to the continuous case without independent verification. Fortunately, some of the significant concepts rely upon differences between entropies, and for these the difficulties disappear.
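One illustration of this closing remark, specialised (our choice) to the Shannon case: the Kullback-Leibler divergence is a difference-type quantity, and it is unchanged by a scaling $Y = 2X$ that shifts each differential entropy individually. For uniform densities on $[0, a]$ and $[0, b]$ with $a \le b$, direct integration gives a divergence of $\log(b/a)$, so a common scaling of both supports cancels.

```python
import math

def kl_uniform(a, b):
    """KL divergence between uniform densities on [0, a] and [0, b], a <= b.
    Direct integration gives log(b / a)."""
    return math.log(b / a)

# Y = 2X shifts each differential entropy by log 2, but the divergence
# between the two transformed densities is unchanged:
print(kl_uniform(1.0, 2.0), kl_uniform(2.0, 4.0))  # both equal log 2
```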