Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/305207 
Year of Publication: 
2022
Citation: 
[Journal:] Computational Optimization and Applications [ISSN:] 1573-2894 [Volume:] 82 [Issue:] 2 [Publisher:] Springer US [Place:] New York, NY [Year:] 2022 [Pages:] 465-498
Publisher: 
Springer US, New York, NY
Abstract: 
We develop a globalized Proximal Newton method for composite and possibly non-convex minimization problems in Hilbert spaces. The assumptions we impose on the composite objective functional with respect to differentiability and convexity are less restrictive than in existing theory. Concerning differentiability of the smooth part of the objective function, we introduce the notion of second order semi-smoothness and discuss why it constitutes an adequate framework for our Proximal Newton method. Nevertheless, both global convergence and local acceleration still hold in this setting. Finally, the convergence properties of the algorithm are illustrated by solving a toy model problem in function space.
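For orientation, the sketch below illustrates the general class of methods the abstract refers to: a proximal Newton iteration for a finite-dimensional composite problem min_x f(x) + g(x) with smooth f and g(x) = lam*||x||_1, globalized by a backtracking line search. It is a minimal illustration under these assumptions, not the authors' Hilbert-space algorithm; the subproblem solver, step-size rule, and test problem are illustrative choices.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (closed form for the l1 term).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_newton(f, grad, hess, lam, x0, max_iter=50, sub_iter=200, tol=1e-8):
    # Generic proximal Newton sketch with backtracking line search.
    x = x0.copy()
    F = lambda z: f(z) + lam * np.sum(np.abs(z))
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        # Subproblem: min_z  g^T(z-x) + 0.5 (z-x)^T H (z-x) + lam*||z||_1,
        # solved approximately by proximal gradient on the quadratic model.
        L = np.linalg.eigvalsh(H).max() + 1e-12  # Lipschitz constant of the model gradient
        z = x.copy()
        for _ in range(sub_iter):
            z = soft_threshold(z - (g + H @ (z - x)) / L, lam / L)
        d = z - x
        if np.linalg.norm(d) < tol:
            break
        # Backtracking line search on the composite objective (simplified descent test).
        alpha, Fx = 1.0, F(x)
        while F(x + alpha * d) > Fx - 1e-4 * alpha * np.dot(d, d) and alpha > 1e-10:
            alpha *= 0.5
        x = x + alpha * d
    return x

if __name__ == "__main__":
    # Toy test problem (illustrative): sparse least squares 0.5*||A x - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.1
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    hess = lambda x: A.T @ A
    x_star = prox_newton(f, grad, hess, lam, np.zeros(10))
    print("objective:", f(x_star) + lam * np.sum(np.abs(x_star)))

The paper's setting differs in essential ways (Hilbert space, non-convex smooth part, second order semi-smoothness instead of twice continuous differentiability), but the basic structure of a second-order model step composed with the non-smooth term, plus a globalization mechanism, is the same.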
Subjects: 
Non-smooth Optimization
Optimization in Hilbert space
Proximal Newton
JEL: 
M15
M37
Persistent Identifier of the first edition: 
Creative Commons License: 
cc-by
Document Type: 
Article
Document Version: 
Published Version

Items in EconStor are protected by copyright, with all rights reserved, unless otherwise indicated.