Please use this link to cite this publication or to refer to it as an internet source: https://hdl.handle.net/10419/36620
Year of publication:
2008
Series/Report no.:
Technical Report No. 2008,25
Publisher:
Technische Universität Dortmund, Sonderforschungsbereich 475 - Komplexitätsreduktion in Multivariaten Datenstrukturen, Dortmund
Abstract:
The computation of robust regression estimates often relies on the minimization of a convex functional on a convex set. In this paper we discuss a general technique, closely related to majorization-minimization algorithms, for computing the minimizers iteratively for a large class of convex functionals. Our approach is based on a quadratic approximation of the functional to be minimized and includes the iteratively reweighted least squares algorithm as a special case. We prove convergence on convex function spaces for general coercive and convex functionals F and derive geometric convergence in certain unconstrained settings. The algorithm is applied to TV-penalized quantile regression and compared with a step-size-corrected Newton-Raphson algorithm. Typically, the iteratively reweighted least squares algorithm performs significantly better in the first steps, whereas the Newton-type method overtakes it only after many iterations. Finally, in the setting of bivariate regression with unimodality constraints, we illustrate how this algorithm makes it possible to utilize highly efficient algorithms for special quadratic programs in more complex settings.
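To make the quadratic-approximation idea concrete, the following is a minimal Python sketch (not taken from the paper): iteratively reweighted least squares for unpenalized L1 regression, i.e. the 0.5-quantile, which the abstract names as a special case of the general scheme. The function name irls_l1, the epsilon guard on the weights, and the fixed iteration count are illustrative assumptions, not the paper's settings.

import numpy as np

def irls_l1(X, y, n_iter=50, eps=1e-8):
    """Minimize sum_i |y_i - x_i' beta| by successive quadratic majorization.

    At the current residuals r, each |r_i| is majorized by
    r_i^2 / (2 * max(|r_i|, eps)) + const, with equality at the current
    iterate, so each step reduces to a weighted least-squares solve.
    The eps guard keeping the weights finite is an assumption, not a
    detail from the paper.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)        # majorizer weights
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted normal equations
    return beta

# Usage: recover a line under heavy-tailed noise, where ordinary least
# squares would be pulled away by outliers but the L1 fit stays close.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + 0.1 * rng.standard_t(df=1, size=200)
print(irls_l1(X, y))   # approximately [1.0, 2.0]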
Keywords:
regression analysis
monotone regression
quantile regression
shape constraints
L1 regression
nonparametric regression
total variation semi-norm
reweighted least squares
Fermat's problem
convex approximation
quadratic approximation
pool adjacent violators algorithm
Document type:
Working Paper

Publications in EconStor are protected by copyright.