A second-order gradient method for convex minimization

Research output: Contribution to journal › Article › peer-review


This work addresses the strictly convex unconstrained minimization problem via a modified version of the gradient method. The proposal is a line search method whose search direction is based on the gradient method. This new direction is constructed as a mixture of the negative gradient direction with another particular direction that uses second-order information. Unlike Newton-type methods, the algorithm does not need to compute the inverse of the Hessian of the objective function. Global convergence is analyzed under an exact line search. A numerical study illustrates the effectiveness of the method, comparing it with several conjugate gradient methods and with the Barzilai–Borwein gradient method on both quadratic and nonlinear problems.
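The abstract's two reference points, the gradient method with exact line search and the Barzilai–Borwein steplength, are standard and can be sketched for the strictly convex quadratic case f(x) = ½ xᵀAx − bᵀx, where the exact step along −g has the closed form α = gᵀg / gᵀAg. This is a minimal illustration of those baselines, not the paper's proposed second-order direction:

```python
import numpy as np

def exact_line_search_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent on f(x) = 0.5 x^T A x - b^T x with A symmetric
    positive definite; the exact step is alpha = g^T g / (g^T A g)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = A @ x - b                       # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))     # exact minimizer along -g
        x = x - alpha * g
    return x

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Barzilai-Borwein gradient method (one of the baselines named in
    the abstract), using the BB1 steplength s^T s / (s^T y)."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)         # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)           # BB1 steplength
        x, g = x_new, g_new
    return x
```

On a well-conditioned quadratic both iterations converge to the unique minimizer x* = A⁻¹b; the BB steplength captures curvature information through the secant pair (s, y) without forming or inverting the Hessian, which is the same trade-off the paper's method targets.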

Original language: English
Article number: 66
Journal: Boletín de la Sociedad Matemática Mexicana
Issue number: 3
State: Published - Nov 2021
Externally published: Yes


  • Convex quadratic optimization
  • Gradient methods
  • Hessian spectral properties
  • Steplength selection


