Abstract
This work addresses the strictly convex unconstrained minimization problem via a modified version of the gradient method. The proposal is a line search method whose search direction is based on the gradient method. This new direction is constructed as a mixture of the negative gradient direction and another particular direction that uses second-order information. Unlike Newton-type methods, our algorithm does not need to compute the inverse of the Hessian of the objective function. We analyze global convergence under an exact line search. A numerical study is carried out to illustrate the effectiveness of the method by comparing it with some conjugate gradient methods, as well as with the Barzilai–Borwein gradient method, on both quadratic and non-linear problems.
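The paper's own search direction is not specified in the abstract, but the Barzilai–Borwein gradient method used as a comparison baseline is standard and can be sketched on a strictly convex quadratic. The sketch below is illustrative only: the function name `bb_gradient`, the test matrix, and the choice of the first steplength are assumptions, not details from the paper; it uses the classical BB1 steplength, α_k = sᵀs / sᵀy with s = x_k − x_{k−1} and y = ∇f(x_k) − ∇f(x_{k−1}).

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Barzilai-Borwein gradient method for min 0.5 x^T A x - b^T x,
    with A symmetric positive definite (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                       # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)     # conservative first steplength (assumed)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g           # gradient step, no line search needed
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)       # BB1 steplength: s^T s / s^T y
        x, g = x_new, g_new
    return x

# Small example: the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(2))
```

Note that, like the method proposed in the paper, this baseline never forms a Hessian inverse: second-order information enters only through the gradient difference `y`.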
| Original language | English |
|---|---|
| Article number | 66 |
| Journal | Boletin de la Sociedad Matematica Mexicana |
| Volume | 27 |
| Issue number | 3 |
| DOIs | |
| State | Published - Nov 2021 |
| Externally published | Yes |
Keywords
- Convex quadratic optimization
- Gradient methods
- Hessian spectral properties
- Steplength selection