TY - JOUR
T1 - A hybrid gradient method for strictly convex quadratic programming
AU - Oviedo, Harry
AU - Dalmau, Oscar
AU - Herrera, Rafael
N1 - Funding Information:
This work was supported in part by CONACYT (Mexico), Grants 258033 and 256126. H. Oviedo thanks Dr. Marcos Raydan for providing pertinent information and for his constructive suggestions.
Publisher Copyright:
© 2020 John Wiley & Sons, Ltd.
PY - 2021/8
Y1 - 2021/8
AB - In this article, we present a reliable hybrid algorithm for solving convex quadratic minimization problems. At the kth iteration, two points are computed: first, an auxiliary point is generated by performing a gradient step using an optimal steplength, and second, the next iterate x_{k+1} is obtained by means of a weighted sum of this auxiliary point with the penultimate iterate x_{k-1}. The coefficient of the linear combination is computed by minimizing the residual norm along the line determined by the previous points. In particular, we adopt an optimal, nondelayed steplength in the first step and then use a smoothing technique to impose a delay on the scheme. Under a modest assumption, we show that our algorithm is Q-linearly convergent to the unique solution of the problem. Finally, we report numerical experiments on strictly convex quadratic problems, showing that the proposed method is competitive in terms of CPU time and iterations with the conjugate gradient method.
KW - convex quadratic optimization
KW - gradient methods
KW - linear system of equations
UR - http://www.scopus.com/inward/record.url?scp=85097840635&partnerID=8YFLogxK
U2 - 10.1002/nla.2360
DO - 10.1002/nla.2360
M3 - Article
AN - SCOPUS:85097840635
SN - 1070-5325
VL - 28
JO - Numerical Linear Algebra with Applications
JF - Numerical Linear Algebra with Applications
IS - 4
M1 - e2360
ER -
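
The AB field above describes the scheme only at abstract level. Below is a minimal NumPy sketch of the two-step idea it outlines, assuming the standard strictly convex quadratic f(x) = (1/2) x'Ax - b'x with A symmetric positive definite. The function name hybrid_gradient, the auxiliary-point symbol y, and the closed-form weight beta are illustrative assumptions, not the paper's notation or exact scheme; in particular, the smoothing/delay technique mentioned in the abstract is not modeled here.

import numpy as np

def hybrid_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    # Sketch of the two-step hybrid idea (names and details are assumptions).
    # x_prev and x play the roles of x_{k-1} and x_k.
    x_prev = np.asarray(x0, dtype=float).copy()
    r = A @ x_prev - b                      # gradient of f at x_prev
    if np.linalg.norm(r) <= tol:
        return x_prev
    alpha = (r @ r) / (r @ (A @ r))         # exact (Cauchy) steplength
    x = x_prev - alpha * r                  # one plain gradient step to start
    for _ in range(max_iter):
        r = A @ x - b
        if np.linalg.norm(r) <= tol:
            break
        # Step 1: auxiliary point from a gradient step with optimal steplength.
        alpha = (r @ r) / (r @ (A @ r))
        y = x - alpha * r
        # Step 2: combine y with the penultimate iterate x_prev, choosing the
        # weight beta that minimizes the residual norm along the line through
        # x_prev and y:  minimize over beta  ||A(x_prev + beta*d) - b||.
        d = y - x_prev
        Ad = A @ d
        r_prev = A @ x_prev - b
        beta = -(r_prev @ Ad) / (Ad @ Ad)   # 1-D least-squares minimizer
        x_prev, x = x, x_prev + beta * d
    return x

# Example usage on a random symmetric positive definite system:
n = 200
M = np.random.randn(n, n)
A = M.T @ M + n * np.eye(n)
b = np.random.randn(n)
x = hybrid_gradient(A, b, np.zeros(n))
print(np.linalg.norm(A @ x - b))            # residual should be near tol

Choosing beta by a one-dimensional least-squares fit on the residual ensures the new residual norm is no larger than the residual at either endpoint of the line, which is consistent with the monotone behavior suggested by the abstract.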