Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/113246 
Title: 

Cumulative Paired 𝜙-Entropy

Year of Publication: 
2015
Series/Report no.: 
IWQW Discussion Papers No. 07/2015
Publisher: 
Friedrich-Alexander-Universität Erlangen-Nürnberg, Institut für Wirtschaftspolitik und Quantitative Wirtschaftsforschung (IWQW), Nürnberg
Abstract: 
We introduce a new kind of entropy that generalizes both the differential entropy and the cumulative (residual) entropy. The generalization is twofold. First, we define the entropy for cumulative distribution functions (cdf) and survivor functions (sf) simultaneously, instead of for densities, the cdf, or the sf alone. Second, we consider a general 'entropy generating function' 𝜙, as in Burbea & Rao (1982) or Liese & Vajda (1987) in the context of 𝜙-divergences. Combining the ideas of a 𝜙-entropy and a cumulative entropy yields the new 'cumulative paired 𝜙-entropy' (CPE𝜙). With some modifications or simplifications, this entropy has already been discussed in at least four scientific disciplines. In fuzzy set theory, cumulative paired 𝜙-entropies were defined for membership functions. A discrete version serves as a measure of dispersion for ordered categorical variables. More recently, uncertainty and reliability theory have considered some variants as measures of information. With only one exception, these discussions seem to have happened independently of each other. We consider CPE𝜙 only for continuous cdfs and show that CPE𝜙 is a measure of dispersion rather than a measure of information. First, we demonstrate this by deriving an upper bound determined by the standard deviation and by solving the maximum entropy problem under the restriction that the variance is fixed. We not only reproduce the central role of the logistic distribution in entropy maximization, but also derive Tukey's Lambda distribution as the solution of an entropy maximization problem. Second, we show explicitly that CPE𝜙 fulfills the axioms of a dispersion measure. The corresponding dispersion functional can easily be estimated by an L-estimator with all its known asymptotic properties. CPE𝜙 is the starting point for several related concepts such as mutual 𝜙-information, 𝜙-correlation, and 𝜙-regression, which generalize Gini correlation and Gini regression. We give a short introduction to all of these related concepts. Linear rank tests for scale can also be developed from the new entropy; we show that almost all known tests are special cases and introduce some new ones. In the literature, Shannon's differential entropy has been calculated explicitly for many distributions. We do the same for CPE𝜙 whenever the cdf is available in closed form.
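As a rough orientation only (the definition is not reproduced on this page, and the paper's exact normalization may differ), the cumulative paired 𝜙-entropy described above is of the form CPE𝜙(F) = ∫ [𝜙(F(x)) + 𝜙(1 − F(x))] dx for a continuous cdf F and a concave entropy generating function 𝜙; the Shannon choice 𝜙(u) = −u ln u combines the cumulative entropy and the cumulative residual entropy. The sketch below illustrates the L-estimator idea mentioned in the abstract for this Shannon case, using the quantile representation CPE(F) = ∫₀¹ Q(u) ln(u/(1 − u)) du. The function name, plotting positions, and all other details are illustrative assumptions, not taken from the paper.

    # Minimal sketch (not the paper's implementation): plug-in L-estimator of the
    # cumulative paired Shannon entropy, i.e. phi(u) = -u * ln(u).
    # It weights the order statistics X_(1) <= ... <= X_(n) by the logistic scores
    # ln(u_i / (1 - u_i)) evaluated at plotting positions u_i = i / (n + 1).
    import numpy as np

    def cpe_shannon_l_estimator(x):
        """L-estimator of the cumulative paired Shannon entropy (illustrative)."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        u = np.arange(1, n + 1) / (n + 1.0)   # plotting positions in (0, 1)
        scores = np.log(u / (1.0 - u))        # logistic scores ln(u / (1 - u))
        return np.mean(scores * x)

    # Usage example: for a large standard logistic sample the estimate is close to
    # pi**2 / 3, the value implied by the quantile representation above.
    # cpe_shannon_l_estimator(np.random.default_rng(0).logistic(size=100_000))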
Subjects: 
𝜙-entropy
differential entropy
absolute mean deviation
cumulative residual entropy
cumulative entropy
measure of dispersion
measure of polarization
generalized maximum entropy principle
Tukey's λ distribution
power logistic distribution
𝜙-dependence
𝜙-regression
L-estimator
linear rank test
Document Type: 
Working Paper

Files in This Item:
The document was removed on behalf of the author(s)/editor(s) on November 30, 2016.
There are no files associated with this item.


Items in EconStor are protected by copyright, with all rights reserved, unless otherwise indicated.