TY - JOUR
T1 - An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
AU - Oviedo, Harry
AU - Dalmau, Oscar
AU - Herrera, Rafael
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Nature B.V.
PY - 2022/6
Y1 - 2022/6
N2 - In this article we address the problem of minimizing a strictly convex quadratic function using a novel iterative method. The new algorithm is based on the well-known Nesterov accelerated gradient method. At each iteration of our scheme, the new point is computed by performing a line search along a direction given by a linear combination of three terms, whose parameters are chosen so that the residual norm is minimized at each step of the process. We establish the linear convergence of the proposed method and show that its convergence rate factor is analogous to the one available for other gradient methods. Finally, we present preliminary numerical results on some sets of synthetic and real strictly convex quadratic problems, showing that the proposed method outperforms, in terms of efficiency, a wide collection of state-of-the-art gradient methods, and that it is competitive with the conjugate gradient method in terms of CPU time and number of iterations.
AB - In this article we address the problem of minimizing a strictly convex quadratic function using a novel iterative method. The new algorithm is based on the well-known Nesterov accelerated gradient method. At each iteration of our scheme, the new point is computed by performing a line search along a direction given by a linear combination of three terms, whose parameters are chosen so that the residual norm is minimized at each step of the process. We establish the linear convergence of the proposed method and show that its convergence rate factor is analogous to the one available for other gradient methods. Finally, we present preliminary numerical results on some sets of synthetic and real strictly convex quadratic problems, showing that the proposed method outperforms, in terms of efficiency, a wide collection of state-of-the-art gradient methods, and that it is competitive with the conjugate gradient method in terms of CPU time and number of iterations.
KW - Accelerated gradient methods
KW - Convex optimization
KW - Gradient method
KW - Linear system of equations
UR - http://www.scopus.com/inward/record.url?scp=85111772253&partnerID=8YFLogxK
U2 - 10.1007/s10543-021-00886-9
DO - 10.1007/s10543-021-00886-9
M3 - Article
AN - SCOPUS:85111772253
SN - 0006-3835
VL - 62
SP - 591
EP - 606
JO - BIT Numerical Mathematics
JF - BIT Numerical Mathematics
IS - 2
ER -