A New Hybrid of DY and CGSD Conjugate Gradient Methods for Solving Unconstrained Optimization Problems
DOI:
https://doi.org/10.25130/tjps.v26i5.183

Abstract
In this article, we present a new hybrid conjugate gradient method for solving large-scale unconstrained optimization problems. The method is a convex combination of the Dai-Yuan conjugate gradient method and Andrei's CGSD method; it satisfies the sufficient descent condition and the well-known Dai-Liao (D-L) conjugacy condition, and at the same time its search direction coincides with the Newton direction under a suitable condition. The proposed method always yields a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search condition, the global convergence of the proposed method is established. Finally, the numerical results we obtained show that our method is efficient and robust.
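The general structure of such a hybrid conjugate gradient iteration can be sketched as follows. This is only an illustrative sketch, not the paper's exact algorithm: the convex-combination weight `theta` is a fixed hypothetical parameter here (the paper derives it from its own conditions), the Hestenes-Stiefel beta stands in for the CGSD term (whose formula is not given in this abstract), and a simple Armijo backtracking step replaces the strong Wolfe-Powell line search.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Sketch of a hybrid CG method: beta = theta*beta_DY + (1-theta)*beta_HS.

    beta_HS (Hestenes-Stiefel) is a stand-in for the CGSD term; theta is
    a hypothetical fixed weight, not the paper's derived parameter.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (stand-in for the strong Wolfe-Powell search)
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        denom = d.dot(y)
        if abs(denom) < 1e-12:
            d = -g_new                       # restart on degenerate denominator
        else:
            beta_dy = g_new.dot(g_new) / denom   # Dai-Yuan beta
            beta_hs = g_new.dot(y) / denom       # Hestenes-Stiefel stand-in
            beta = theta * beta_dy + (1 - theta) * beta_hs
            d = -g_new + beta * d
            if d.dot(g_new) > -1e-12:
                d = -g_new                   # safeguard: enforce descent
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = 0.5 xᵀAx - bᵀx with A symmetric positive definite, the iteration drives the gradient Ax - b to zero.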
License
Copyright (c) 2022 Tikrit Journal of Pure Science
This work is licensed under a Creative Commons Attribution 4.0 International License.
Tikrit Journal of Pure Science is licensed under the Creative Commons Attribution 4.0 International License, which allows users to copy, create extracts, abstracts, and new works from the article, alter and revise the article, and make commercial use of the article (including reuse and/or resale of the article by commercial entities), provided the user gives appropriate credit (with a link to the formal publication through the relevant DOI), provides a link to the license, indicates if changes were made, and does not represent the licensor as endorsing the use made of the work. The authors hold the copyright for their published work on the Tikrit J. Pure Sci. website, while Tikrit J. Pure Sci. is responsible for appropriate citation of their work, which is released under CC-BY-4.0, enabling the unrestricted use, distribution, and reproduction of an article in any medium, provided that the original work is properly cited.