This paper presents two proposals for preconditioners for the Nonlinear Conjugate Gradient (NCG) method in large scale unconstrained optimization. On one hand, the common idea behind our preconditioners is inspired by L-BFGS quasi–Newton updates; on the other hand, we aim at explicitly approximating, in some sense, the inverse of the Hessian matrix. Since we deal with large scale optimization problems, we propose matrix–free approaches where the preconditioners are built using symmetric low–rank updating formulae. Our distinctive new contributions rely on using information on the objective function, collected as a by-product of the NCG at previous iterations. Broadly speaking, our first approach exploits the secant equation in order to impose interpolation conditions on the objective function. In the second proposal we adopt an ad hoc modified–secant approach, in order to possibly guarantee some additional theoretical properties.
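To illustrate the kind of matrix–free, secant-based low-rank preconditioner the abstract describes, the sketch below applies a single-pair BFGS-style inverse-Hessian approximation to a vector. This is a generic textbook construction, not the paper's actual updating formula: the function name, the diagonal initial guess `h0`, and the single correction pair `(s, y)` are all illustrative assumptions. The operator satisfies the secant equation `H y = s` by construction.

```python
import numpy as np

def apply_lbfgs_precond(v, s, y, h0=1.0):
    """Apply, matrix-free, the one-pair BFGS inverse-Hessian approximation
        H = (I - rho*s*y^T) * (h0*I) * (I - rho*y*s^T) + rho*s*s^T,
    with rho = 1/(y^T s), to the vector v.  Here s = x_{k+1} - x_k and
    y = g_{k+1} - g_k are a step/gradient-difference pair collected as a
    by-product of the iterations.  H satisfies the secant equation H y = s.
    NOTE: illustrative sketch only, not the preconditioner from the paper."""
    rho = 1.0 / (y @ s)
    t = v - rho * y * (s @ v)     # (I - rho*y*s^T) v
    t = h0 * t                    # multiply by the initial guess h0*I
    t = t - rho * s * (y @ t)     # (I - rho*s*y^T) (...)
    return t + rho * s * (s @ v)  # add the rank-one term rho*s*s^T v

# Illustrative check that the secant equation H y = s holds.
rng = np.random.default_rng(0)
s = rng.standard_normal(5)
y = rng.standard_normal(5)
if y @ s <= 0:      # enforce the curvature condition y^T s > 0 for the demo
    y = -y
print(np.allclose(apply_lbfgs_precond(y, s, y), s))  # True
```

Because `H` is only ever applied to vectors, storage and cost stay linear in the problem dimension, which is the point of the matrix–free approach for large scale problems.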
|Title:||Preconditioning strategies for Nonlinear Conjugate Gradient methods, based on Quasi-Newton updates|
|Publication date:||2016|
|Appears in type:||4.1 Article in conference proceedings|