Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/312463 
Year of Publication: 
2023
Citation: 
[Journal:] Computational Optimization and Applications [ISSN:] 1573-2894 [Volume:] 87 [Issue:] 3 [Publisher:] Springer US [Place:] New York, NY [Year:] 2023 [Pages:] 977-1008
Publisher: 
Springer US, New York, NY
Abstract: 
In this contribution, we present a numerical analysis of the continuous stochastic gradient (CSG) method, covering applications from topology optimization as well as first convergence rate results. In contrast to standard stochastic gradient optimization schemes, CSG does not discard old gradient samples from previous iterations. Instead, design-dependent integration weights are computed to form a convex combination of these samples that approximates the true gradient at the current design. Since the approximation error vanishes in the course of the iterations, CSG represents a hybrid approach: it starts off like a purely stochastic method and behaves like a full gradient scheme in the limit. In this work, the efficiency of CSG is demonstrated for practically relevant applications from topology optimization. These settings are characterized both by a large number of optimization variables and by an objective function whose evaluation requires the numerical computation of multiple integrals concatenated in a nonlinear fashion. Such problems were previously intractable for existing optimization methods. Lastly, with regard to convergence rates, first estimates are provided and confirmed by numerical experiments.
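
To make the weighting idea concrete, the following toy sketch illustrates a CSG-style update on a one-dimensional expected-value objective. It is a minimal illustration under stated assumptions, not the authors' reference implementation: the quadratic toy objective, the Monte Carlo nearest-neighbor weight rule (which also penalizes the distance of a stored design to the current design), and all parameter choices (step size tau, metric weight c) are assumptions made for readability.

import numpy as np

rng = np.random.default_rng(0)

# Toy objective: J(u) = E_x[ j(u, x) ] with x ~ Uniform(0, 1) and
# j(u, x) = 0.5 * (u - x)^2, so the minimizer is u* = E[x] = 0.5.
def grad_j(u, x):
    return u - x  # gradient of j with respect to the design u

u = np.array([3.0])   # initial design (assumed)
tau = 0.5             # constant step size, one of the rules studied in the paper
c = 1.0               # weight of the design distance in the metric (assumed)
designs, samples, grads = [], [], []

for n in range(200):
    x = rng.uniform(0.0, 1.0, size=1)  # draw one fresh sample per iteration
    designs.append(u.copy())
    samples.append(x)
    grads.append(grad_j(u, x))         # gradient sample at the current design

    # Design-dependent integration weights: estimate, by Monte Carlo, the
    # measure of the region of the parameter space that each stored sample
    # "owns" when both the sample distance and the distance of its design
    # to the current design enter the metric.
    D, X, G = np.array(designs), np.array(samples), np.array(grads)
    probe = rng.uniform(0.0, 1.0, size=(512, 1))
    dist = (np.abs(probe[:, None, :] - X[None, :, :]).sum(-1)
            + c * np.abs(u[None, :] - D).sum(-1)[None, :])
    owner = dist.argmin(axis=1)
    w = np.bincount(owner, minlength=len(grads)) / len(probe)  # sums to 1

    # Convex combination of ALL past gradient samples; the approximation of
    # the full gradient improves as samples accumulate near the current design.
    u = u - tau * (w[:, None] * G).sum(axis=0)

print(u)  # approaches 0.5, the minimizer of the toy objective

Note that the stored gradients were evaluated at older designs; the design-distance penalty in the metric shifts weight toward samples recorded near the current design, which is how the approximation error can vanish along the iterations.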
Subjects: 
Stochastic gradient scheme
Convergence analysis
Step size rule
Backtracking line search
Constant step size
Creative Commons License: 
cc-by
Document Type: 
Article
Document Version: 
Published Version
