Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/31010 
Authors: 
Year of Publication: 
2005
Series/Report no.: 
Discussion Paper No. 419
Publisher: 
Ludwig-Maximilians-Universität München, Sonderforschungsbereich 386 - Statistische Analyse diskreter Strukturen, München
Abstract: 
Classification trees based on imprecise probabilities are an advancement of classical classification trees. While the Gini index is the default splitting criterion in classical classification trees, an extension of the Shannon entropy has been introduced as the splitting criterion in classification trees based on imprecise probabilities. However, using such empirical entropy measures as split selection criteria can induce a bias in variable selection, so that variables are preferred for features other than their information content. This bias is not eliminated by the imprecise probability approach. The source of variable selection bias for the estimated Shannon entropy is outlined, along with possible corrections. The variable selection performance of the biased and corrected estimators is evaluated in a simulation study. Additional results from research on variable selection bias in classical classification trees are incorporated, motivating further investigation of alternative split selection criteria in classification trees based on imprecise probabilities.
Keywords: 
Classification trees ; credal classification ; variable selection bias ; attribute selection error ; Shannon entropy ; entropy estimation
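
The variable selection bias summarized in the abstract can be reproduced in a few lines. The following is a minimal, illustrative Python sketch, not taken from the paper; the function names and simulation settings are my own assumptions. It computes the plug-in (empirical) Shannon entropy and the resulting entropy-based split gain, then shows that an uninformative predictor with many categories is preferred over an uninformative predictor with few categories in nearly all replications.

```python
import numpy as np

rng = np.random.default_rng(0)

def plug_in_entropy(labels):
    """Empirical (plug-in) Shannon entropy of a label vector, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def split_gain(y, x):
    """Reduction of empirical entropy of y when splitting by the categories of x."""
    conditional = 0.0
    for v in np.unique(x):
        mask = x == v
        conditional += mask.mean() * plug_in_entropy(y[mask])
    return plug_in_entropy(y) - conditional

# Simulation: both predictors are pure noise with respect to y,
# but x_many has more categories than x_few.
n, n_rep = 60, 2000
wins_many = 0
for _ in range(n_rep):
    y = rng.integers(0, 2, n)        # binary class label
    x_few = rng.integers(0, 2, n)    # 2 categories, uninformative
    x_many = rng.integers(0, 8, n)   # 8 categories, uninformative
    if split_gain(y, x_many) > split_gain(y, x_few):
        wins_many += 1

# With unbiased selection we would expect each variable to win roughly half
# the time; the multi-category noise variable is selected far more often.
print(f"x_many preferred in {wins_many / n_rep:.1%} of replications")
```

The effect arises because the plug-in entropy estimator is negatively biased, and the bias grows with the number of cells relative to the sample size. Corrections such as the Miller-Madow adjustment reduce this estimation bias; whether they coincide with the corrections evaluated in the paper is not established here.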
Persistent Identifier of the first edition: 
Document Type: 
Working Paper

Files in This Item:
File size: 141.03 kB
