TY - JOUR
T1 - A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
AU - Oviedo, Harry
AU - Andreani, Roberto
AU - Raydan, Marcos
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/7
Y1 - 2022/7
N2 - We introduce a family of weighted conjugate-gradient-type methods for strictly convex quadratic functions, whose parameters are determined by a minimization model based on a convex combination of the objective function and its gradient norm. This family includes the classical linear conjugate gradient method and the recently published delayed weighted gradient method as the extreme cases of the convex combination. The intermediate cases produce a merit function that offers a compromise between function-value reduction and stationarity, which is convenient for practical applications. We show that every member of this infinite family exhibits q-linear convergence to the unique solution. Moreover, each member enjoys finite termination and an optimality property related to the combined merit function. In particular, we prove that if the n × n Hessian of the quadratic function has p < n distinct eigenvalues, then each member of the family obtains the unique global minimizer in exactly p iterations. Numerical results demonstrate that the proposed family is promising and exhibits fast convergence behavior, which motivates the use of preconditioning strategies as well as its extension to the numerical solution of general unconstrained optimization problems.
AB - We introduce a family of weighted conjugate-gradient-type methods for strictly convex quadratic functions, whose parameters are determined by a minimization model based on a convex combination of the objective function and its gradient norm. This family includes the classical linear conjugate gradient method and the recently published delayed weighted gradient method as the extreme cases of the convex combination. The intermediate cases produce a merit function that offers a compromise between function-value reduction and stationarity, which is convenient for practical applications. We show that every member of this infinite family exhibits q-linear convergence to the unique solution. Moreover, each member enjoys finite termination and an optimality property related to the combined merit function. In particular, we prove that if the n × n Hessian of the quadratic function has p < n distinct eigenvalues, then each member of the family obtains the unique global minimizer in exactly p iterations. Numerical results demonstrate that the proposed family is promising and exhibits fast convergence behavior, which motivates the use of preconditioning strategies as well as its extension to the numerical solution of general unconstrained optimization problems.
KW - Conjugate gradient methods
KW - Gradient methods
KW - Moreau envelope
KW - Strictly convex quadratics
KW - Unconstrained optimization
UR - http://www.scopus.com/inward/record.url?scp=85119826801&partnerID=8YFLogxK
U2 - 10.1007/s11075-021-01228-0
DO - 10.1007/s11075-021-01228-0
M3 - Article
AN - SCOPUS:85119826801
SN - 1017-1398
VL - 90
SP - 1225
EP - 1252
JO - Numerical Algorithms
JF - Numerical Algorithms
IS - 3
ER -