Authors: 
Klein, Ingo
Mangold, Benedikt
Year of Publication: 
2015
Series/Report no.: 
IWQW Discussion Paper Series 07/2015
Abstract: 
A new kind of entropy is introduced, generalizing both the differential entropy and the cumulative (residual) entropy. The generalization is twofold. First, we define the entropy simultaneously for cumulative distribution functions (cdf) and survivor functions (sf), instead of for densities, the cdf, or the sf alone. Second, we consider a general 'entropy generating function' Φ, as Burbea & Rao (1982) and Liese & Vajda (1987) do in the context of Φ-divergences. Combining the ideas of a Φ-entropy and a cumulative entropy yields the new 'cumulative paired Φ-entropy' (CPE_Φ). With some modifications or simplifications, this new entropy has already been discussed in at least four scientific disciplines: in fuzzy set theory, cumulative paired Φ-entropies were defined for membership functions; a discrete version serves as a measure of dispersion for ordered categorical variables; and, more recently, uncertainty and reliability theory have considered some variants as measures of information. With only one exception, these discussions appear to have proceeded independently of each other. We consider CPE_Φ only for continuous cdfs and show that CPE_Φ is a measure of dispersion rather than a measure of information. First, we demonstrate this by deriving an upper bound determined by the standard deviation and by solving the maximum entropy problem under the restriction of a fixed variance. We not only reproduce the central role of the logistic distribution in entropy maximization, but also derive Tukey's lambda distribution as the solution of an entropy maximization problem. Second, we show explicitly that CPE_Φ fulfills the axioms of a dispersion measure. The corresponding dispersion functional can easily be estimated by an L-estimator with all its known asymptotic properties. CPE_Φ is the starting point for several related concepts, such as mutual Φ-information, Φ-correlation, and Φ-regression, which generalize Gini correlation and Gini regression. We give a short introduction to all of these related concepts. Linear rank tests for scale can also be developed from the new entropy; we show that almost all known tests are special cases and introduce some new ones. In the literature, Shannon's differential entropy has been calculated explicitly for many distributions. We have done the same for CPE_Φ whenever the cdf is available in closed form.
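As an illustration of the quantity described above, the following is a minimal numerical sketch in Python (the function names cpe and phi_shannon and the integration bounds are illustrative, not taken from the paper). It approximates the paired form suggested by the abstract, CPE_Φ(F) = ∫ [Φ(F(x)) + Φ(1 − F(x))] dx, for the Shannon-type generator Φ(u) = −u·log(u), and checks the result against the closed-form value π²/3 for the standard logistic distribution, the maximum entropy distribution mentioned in the abstract:

    import numpy as np
    from scipy import integrate, stats

    def phi_shannon(u):
        # Shannon-type entropy generating function Phi(u) = -u log u,
        # clipped so that Phi(0) evaluates to 0 instead of NaN.
        u = np.clip(u, 1e-300, 1.0)
        return -u * np.log(u)

    def cpe(cdf, lo=-50.0, hi=50.0, phi=phi_shannon):
        # Numerically approximate CPE_Phi(F) = int [Phi(F(x)) + Phi(1 - F(x))] dx.
        # The bounds lo/hi are an assumption suited to light-tailed distributions.
        integrand = lambda x: phi(cdf(x)) + phi(1.0 - cdf(x))
        value, _abs_err = integrate.quad(integrand, lo, hi, limit=200)
        return value

    # Standard logistic cdf: the Shannon-type CPE equals pi^2 / 3.
    print(cpe(stats.logistic.cdf))  # approx. 3.2899
    print(np.pi ** 2 / 3)           # 3.2899...

For heavier-tailed distributions the integration bounds would need to be widened. Note that rescaling the variable by s multiplies the integral by s, consistent with the abstract's claim that CPE_Φ behaves as a dispersion measure rather than an information measure.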
Keywords: 
differential entropy
absolute mean deviation
cumulative residual entropy
cumulative entropy
measure of dispersion
measure of polarization
generalized maximum entropy principle
Tukey's lambda distribution
power logistic distribution
linear rank test
Document Type: 
Working Paper
