A delayed weighted gradient method for strictly convex quadratic minimization

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, an accelerated version of the steepest descent method is developed via a two-step iteration. The new algorithm uses delayed information to define the iterates. Specifically, in the first step a prediction of the new trial point is computed by the gradient method with the exact minimal gradient steplength, and then a correction is computed as a weighted sum of the prediction and the predecessor of the current iterate. A convergence result is provided. Numerical experiments are performed to compare the efficiency and effectiveness of the proposal with similar methods from the literature. The numerical comparison of the new algorithm with the classical conjugate gradient method shows that our method is a good alternative for solving large-scale problems.
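The two-step iteration described in the abstract can be sketched as follows. This is a minimal illustration, assuming the strictly convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite; the function name, stopping rule, and the closed-form correction weight (chosen here to exactly minimize the norm of the new gradient, which is tractable because the gradient is affine in x) are illustrative assumptions, not necessarily the paper's exact rules.

```python
import numpy as np

def delayed_weighted_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Sketch of a two-step delayed weighted gradient iteration for
    minimizing f(x) = 0.5 x^T A x - b^T x, with A symmetric positive
    definite. Illustrative reconstruction, not the paper's verbatim code."""
    x_prev = x0.copy()
    g_prev = A @ x_prev - b      # gradient at the predecessor iterate
    x, g = x0.copy(), g_prev.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        w = A @ g
        # Prediction step: exact minimal-gradient steplength,
        # i.e. t minimizes ||g(x - t*g)|| over t.
        t = (g @ w) / (w @ w)
        y = x - t * g            # predicted point
        r = g - t * w            # gradient at y (exact, since g(x) is affine)
        # Correction step: weighted sum of the prediction y and the
        # predecessor iterate x_prev; the weight beta minimizes the
        # norm of the resulting gradient beta*r + (1-beta)*g_prev.
        d = g_prev - r
        beta = (g_prev @ d) / (d @ d) if d @ d > 0 else 1.0
        x_new = beta * y + (1.0 - beta) * x_prev
        g_new = beta * r + (1.0 - beta) * g_prev
        x_prev, g_prev = x, g    # shift the delayed (one-step-back) state
        x, g = x_new, g_new
    return x
```

Since minimizing the quadratic is equivalent to solving Ax = b, the returned iterate can be checked against the linear system directly.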

Original language: English
Pages (from-to): 729-746
Number of pages: 18
Journal: Computational Optimization and Applications
Volume: 74
Issue number: 3
DOIs
State: Published - 1 Dec 2019
Externally published: Yes

Keywords

  • Convex quadratic optimization
  • Gradient methods
  • Linear system of equations
