Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/242421 
Year of Publication: 
2021
Series/Report no.: 
Beiträge zur Jahrestagung des Vereins für Socialpolitik 2021: Climate Economics
Publisher: 
ZBW - Leibniz Information Centre for Economics, Kiel, Hamburg
Abstract: 
Research on the measurement of uncertainty has a long tradition. Recently, the creation of the economic policy uncertainty index sparked a new wave of research on this topic. The index is built from major American newspapers by manually labeling articles and counting specific keywords. Several attempts to automate this procedure have since been undertaken, using Support Vector Machines and LDA analysis. The current paper takes these efforts one step further and offers an algorithm based on natural language processing and deep learning techniques for the quantification of economic policy uncertainty. The new approach accurately distills the latent "uncertainty" underlying newspaper articles, enables the automated construction of a new index for the measurement of economic policy uncertainty, and improves on existing methods. The potential uses of our new index extend to political uncertainty management, business cycle analysis, financial forecasting, and, potentially, derivative pricing.
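For readers unfamiliar with the keyword-based construction the abstract refers to, the sketch below illustrates how newspaper articles can be flagged by keyword sets and aggregated into a monthly index. The term sets, function names, and rescaling convention are illustrative assumptions for exposition only, not the procedure used in this paper or the exact recipe of any published index.

```python
from collections import defaultdict

# Hypothetical term sets in the spirit of keyword-based EPU construction;
# the exact vocabularies used by any particular index are an assumption here.
ECONOMY_TERMS = {"economy", "economic"}
POLICY_TERMS = {"congress", "deficit", "federal reserve", "legislation", "regulation", "white house"}
UNCERTAINTY_TERMS = {"uncertain", "uncertainty"}

def is_epu_article(text: str) -> bool:
    """Flag an article if it contains at least one term from each set."""
    lowered = text.lower()
    return (
        any(term in lowered for term in ECONOMY_TERMS)
        and any(term in lowered for term in POLICY_TERMS)
        and any(term in lowered for term in UNCERTAINTY_TERMS)
    )

def monthly_epu_index(articles):
    """Compute a raw monthly index from (month, text) pairs.

    The share of flagged articles per month is the raw signal; rescaling
    the series to a mean of 100 mirrors common practice for such indices.
    """
    flagged = defaultdict(int)
    totals = defaultdict(int)
    for month, text in articles:
        totals[month] += 1
        if is_epu_article(text):
            flagged[month] += 1
    shares = {m: flagged[m] / totals[m] for m in totals}
    mean_share = sum(shares.values()) / len(shares)
    return {m: 100 * s / mean_share for m, s in sorted(shares.items())}

# Toy usage example
if __name__ == "__main__":
    sample = [
        ("2021-01", "The Federal Reserve signalled uncertainty about economic recovery."),
        ("2021-01", "Local sports results from the weekend."),
        ("2021-02", "New legislation raises economic uncertainty for exporters."),
    ]
    print(monthly_epu_index(sample))
```

The paper's contribution replaces this hand-crafted counting step with an NLP and deep learning pipeline that infers the latent uncertainty signal directly from the article text; the sketch above is only the keyword baseline it improves upon.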
Subjects: 
Economic Policy Uncertainty
Deep Learning
Natural Language Processing
Text Data
Forecasting
JEL: 
E00
Document Type: 
Conference Paper
