A hybrid gradient method for strictly convex quadratic programming

Harry Oviedo, Oscar Dalmau, Rafael Herrera

Research output: Contribution to journal › Article › peer-review

4 Scopus citations


In this article, we present a reliable hybrid algorithm for solving convex quadratic minimization problems. At the k-th iteration, two points are computed: first, an auxiliary point is generated by performing a gradient step with an optimal steplength, and second, the next iterate x_{k+1} is obtained as a weighted sum of this auxiliary point with the penultimate iterate x_{k-1}. The coefficient of the linear combination is computed by minimizing the residual norm along the line determined by the two previous points. In particular, we adopt an optimal, nondelayed steplength in the first step and then use a smoothing technique to impose a delay on the scheme. Under a modest assumption, we show that our algorithm is Q-linearly convergent to the unique solution of the problem. Finally, we report numerical experiments on strictly convex quadratic problems, showing that the proposed method is competitive in CPU time and iteration count with the conjugate gradient method.
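The two-step scheme described in the abstract can be sketched for the equivalent linear system Ax = b with A symmetric positive definite. The sketch below is an illustration under stated assumptions, not the authors' exact algorithm: the auxiliary point uses the classical optimal (Cauchy) steplength, and the mixing coefficient is chosen in closed form to minimize the residual norm along the line through the auxiliary point and the penultimate iterate. Function and variable names are hypothetical.

```python
import numpy as np

def hybrid_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Illustrative hybrid gradient iteration for A x = b, A SPD (assumed sketch)."""
    x_prev = x0.copy()
    # Start with one plain steepest-descent step using the Cauchy steplength.
    g = A @ x0 - b
    alpha = (g @ g) / (g @ (A @ g))
    x = x0 - alpha * g
    for _ in range(max_iter):
        g = A @ x - b                      # gradient of (1/2) x^T A x - b^T x
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))    # optimal (Cauchy) steplength
        z = x - alpha * g                  # auxiliary gradient step
        # Pick beta minimizing || residual of beta*z + (1-beta)*x_prev ||_2:
        # with d = r(z) - r(x_prev), the minimizer is beta = -(r_prev . d)/(d . d).
        r_z = b - A @ z
        r_prev = b - A @ x_prev
        d = r_z - r_prev
        beta = 1.0 if d @ d == 0.0 else -(r_prev @ d) / (d @ d)
        x_prev, x = x, beta * z + (1.0 - beta) * x_prev
    return x
```

Since beta = 1 recovers the plain gradient step, the combination can only reduce the residual norm relative to the auxiliary point at each iteration.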

Original language: English
Article number: e2360
Journal: Numerical Linear Algebra with Applications
Issue number: 4
State: Published - Aug 2021
Externally published: Yes


Keywords:
  • convex quadratic optimization
  • gradient methods
  • linear system of equations


