Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/129988 
Authors: 
Year of Publication: 2015
Series/Report no.: School of Economics Discussion Papers No. 1504
Publisher: University of Kent, School of Economics, Canterbury
Abstract: 
The recent increase in the breadth of computational methodologies has been matched by a corresponding increase in the difficulty of comparing the relative explanatory power of models from different methodological lineages. To help address this problem, a universal information criterion (UIC) is developed that is analogous to the Akaike information criterion (AIC) in its theoretical derivation, yet can be applied to any model able to generate simulated or predicted data, regardless of its methodology. Both the AIC and the proposed UIC rely on the Kullback-Leibler (KL) distance between model predictions and real data as a measure of prediction accuracy. Rather than taking the maximum likelihood approach of the AIC, the proposed UIC relies on the literal interpretation of the KL distance as the inefficiency of compressing real data using modelled probabilities, and therefore uses the output of a universal compression algorithm to obtain an estimate of the KL distance. Several Monte Carlo tests are carried out to (a) confirm the performance of the algorithm and (b) evaluate the ability of the UIC to identify the true data-generating process from a set of alternative models.
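As an informal illustration of the compression-based reading of the KL distance described in the abstract, the sketch below scores candidate models by how cheaply a generic off-the-shelf compressor (Python's bz2, used here only as a stand-in for the paper's universal compression algorithm, whose specifics are not given in this record) can encode the observed series once it has been conditioned on each model's simulated output. The function names and the ranking helper are hypothetical and added purely for illustration; they are not the paper's UIC.

import bz2  # generic compressor, used purely as an illustrative stand-in


def code_length_bits(data: bytes) -> int:
    """Number of bits the bz2 compressor needs to describe `data`."""
    return 8 * len(bz2.compress(data))


def conditional_cost_per_obs(real: bytes, simulated: bytes) -> float:
    """Approximate per-observation cost (in bits) of encoding the real series
    after the compressor has already processed the model's simulated series.
    Lower values suggest the model's output makes the real data cheaper to
    describe, i.e. a smaller KL-style coding inefficiency."""
    joint = code_length_bits(simulated + real)
    model_only = code_length_bits(simulated)
    return (joint - model_only) / len(real)


def rank_models(real: bytes, simulations: dict) -> list:
    """Order candidate models from cheapest to most expensive coding cost
    for the observed data (hypothetical helper, not the paper's method)."""
    scores = {name: conditional_cost_per_obs(real, sim)
              for name, sim in simulations.items()}
    return sorted(scores.items(), key=lambda item: item[1])


# Usage with hypothetical discretised series:
#   observed = bytes(...)
#   print(rank_models(observed, {"model_A": sim_a, "model_B": sim_b}))

A block compressor such as bz2 only roughly approximates the conditional code length used in this sketch; the paper's estimate relies on a proper universal compression algorithm rather than the stand-in shown here.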
Subjects: AIC; Minimum description length; Model selection
JEL: B41, C15, C52, C63
Document Type: Working Paper

Files in This Item: 1 file (902.03 kB)