Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/169214 
Year of Publication: 
2017
Series/Report no.: 
SFB 649 Discussion Paper No. 2017-024
Publisher: 
Humboldt University of Berlin, Collaborative Research Center 649 - Economic Risk, Berlin
Abstract: 
This paper considers smooth principal component analysis for high-dimensional data with a very large observation dimension p and a moderate number of individuals N. Our setting is similar to traditional PCA, but we assume the factors are smooth and design a new approach to estimate them. By connecting the problem to a singular value decomposition subject to penalized smoothing, our algorithm is linear in the dimensionality of the data, and it also favors block calculations and sequential memory access. Unlike most existing methods, we avoid extracting eigenfunctions by smoothing a very high-dimensional covariance operator. Under regularity assumptions, our results indicate that a faster convergence rate can be obtained by exploiting the smoothness assumption. We also extend our method to the case where each subject performs multiple tasks by adopting a two-way ANOVA approach, which further demonstrates the advantages of our approach.
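The paper's algorithm is not reproduced on this page. As a rough illustration of the stated connection between a rank-one SVD step and penalized smoothing, the sketch below alternates a score update with a roughness-penalized loading update, in the spirit of regularized SVD. The second-difference penalty, the parameter lam, and the dense solver are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

def smooth_rank_one(X, lam=1.0, n_iter=100, tol=1e-8):
    """Rank-one smooth SVD via alternating updates (illustrative sketch).

    X   : (N, p) data matrix, rows = individuals, columns = grid points.
    lam : roughness-penalty weight (assumed second-difference penalty).
    Returns scores (N,) and a smooth, unit-norm loading vector (p,).
    """
    N, p = X.shape
    # Second-difference matrix D, so that ||D v||^2 penalizes roughness of v.
    D = np.diff(np.eye(p), n=2, axis=0)
    P = D.T @ D

    # Initialize the loading with the row of largest norm, normalized.
    v = X[np.argmax(np.linalg.norm(X, axis=1))].copy()
    v /= np.linalg.norm(v)

    for _ in range(n_iter):
        u = X @ v                                   # score update given current loading
        # Penalized least-squares update of the loading:
        # minimize ||X - u v^T||_F^2 + lam * ||D v||^2 over v.
        v_new = np.linalg.solve((u @ u) * np.eye(p) + lam * P, X.T @ u)
        v_new /= np.linalg.norm(v_new)              # keep the loading on the unit sphere
        if np.linalg.norm(v_new - v) < tol:
            v = v_new
            break
        v = v_new

    return X @ v, v
```

Because the assumed penalty matrix is banded, the loading update could use a banded solver and each iteration would then scale linearly in p; the dense solve above is kept only for brevity.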
Subjects: 
Principal Component Analysis
Penalized Smoothing
Asymptotics
Multilevel
fMRI
Document Type: 
Working Paper
