Abstract
This paper presents a new adaptive steplength for the gradient method, which exploits the advantages of the two steplengths proposed by Barzilai and Borwein. In particular, the proposed steplength is based on an optimal step size determined by minimizing a merit function constructed as a convex combination of the cost function and its gradient norm. Global convergence and some theoretical properties of the proposed gradient method are established. Finally, computational studies are included to highlight the efficiency and effectiveness of the new approach.
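For context, the classical Barzilai-Borwein (BB1) steplength that the abstract builds on can be sketched as follows. This is a minimal illustration of the standard BB rule on an assumed quadratic test problem, not the paper's merit-function-based steplength; the function names and initial steplength are choices made here for illustration.

```python
import numpy as np

def bb_gradient_descent(grad, x0, max_iter=100, tol=1e-8):
    """Gradient method with the Barzilai-Borwein (BB1) steplength.

    BB1 sets alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1}
    and y = g_k - g_{k-1}. The paper's adaptive blend of the two BB
    steplengths via a merit function is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3  # small initial steplength (arbitrary choice)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x   # iterate difference
        y = g_new - g   # gradient difference
        sy = s @ y
        # BB1 steplength; guard against a non-positive denominator
        alpha = (s @ s) / sy if sy > 0 else 1e-3
        x, g = x_new, g_new
    return x

# Illustrative convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient_descent(lambda x: A @ x - b, np.zeros(2))
```

On quadratics, the BB method is known to converge globally despite its nonmonotone behavior, which is part of what motivates adaptive variants like the one proposed in the paper.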
| Original language | English |
| --- | --- |
| Pages (from-to) | 873-885 |
| Number of pages | 13 |
| Journal | Ricerche di Matematica |
| Volume | 73 |
| Issue | 2 |
| DOI | |
| Status | Published - Apr. 2024 |
| Published externally | Yes |