Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/36620 
Year of Publication: 
2008
Series/Report no.: 
Technical Report No. 2008,25
Publisher: 
Technische Universität Dortmund, Sonderforschungsbereich 475 - Komplexitätsreduktion in Multivariaten Datenstrukturen, Dortmund
Abstract: 
The computation of robust regression estimates often relies on minimizing a convex functional on a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for computing the minimizers iteratively for a large class of convex functionals. Our approach is based on a quadratic approximation of the functional to be minimized and includes the iteratively reweighted least squares algorithm as a special case. We prove convergence on convex function spaces for general coercive and convex functionals F and derive geometric convergence in certain unconstrained settings. The algorithm is applied to TV penalized quantile regression and compared with a step-size corrected Newton-Raphson algorithm. Typically, the iteratively reweighted least squares algorithm performs significantly better in the first iterations, whereas the Newton-type method overtakes it only after many iterations. Finally, in the setting of bivariate regression with unimodality constraints, we illustrate how this algorithm makes it possible to exploit highly efficient algorithms for special quadratic programs in more complex settings.
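
The quadratic-approximation idea described in the abstract can be illustrated with a minimal sketch. The Python code below (all names, such as irls_quantile, tau, and eps, are illustrative and not taken from the paper) applies iteratively reweighted least squares to plain, unpenalized quantile regression: the check loss is bounded above by a quadratic touching it at the current residuals, and each iteration solves the resulting weighted least-squares problem. The paper's actual setting (TV penalties, convex function spaces, unimodality constraints) is more general than this sketch.

import numpy as np

def irls_quantile(X, y, tau=0.5, n_iter=100, eps=1e-8, tol=1e-10):
    """Hypothetical sketch: IRLS for quantile regression via quadratic
    majorization of the check loss rho_tau(r) = r * (tau - 1{r < 0}).

    Each step replaces the check loss by a quadratic upper bound that
    touches it at the current residuals and solves the resulting
    weighted least-squares problem in closed form."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        # Majorizer weights |tau - 1{r<0}| / |r|, clipped by eps so the
        # quadratic bound stays defined at (near-)zero residuals.
        w = np.abs(tau - (r < 0)) / np.maximum(np.abs(r), eps)
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Toy usage: median regression (tau = 0.5) under heavy-tailed noise.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=200)
print(irls_quantile(X, y, tau=0.5))

With tau = 0.5 this reduces to the classical IRLS scheme for L1 (median) regression; the eps safeguard is one common way to handle vanishing residuals, where the quadratic bound degenerates.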
Subjects: 
regression analysis
monotone regression
quantile regression
shape constraints
L1 regression
nonparametric regression
total variation semi-norm
reweighted least squares
Fermat's problem
convex approximation
quadratic approximation
pool adjacent violators algorithm
Document Type: 
Working Paper

Files in This Item:
File (281.07 kB)