An Investigation of a New Hybrid Conjugate Gradient Approach for Unconstrained Optimization Problems
DOI: https://doi.org/10.62383/bilangan.v2i4.136

Keywords: Numerical optimization, Unconstrained objective function, Hybrid gradient methods, Global convergence, Numerical experiment

Abstract
This work introduces a novel hybrid conjugate gradient (CG) technique for solving unconstrained optimization problems with improved efficiency and effectiveness. The CG parameter is computed as a convex combination of two standard conjugate gradient parameters. Under the strong Wolfe line search (SWC) and specific conditions, the proposed method is shown to be globally convergent. In addition, the new hybrid CG approach generates a descent search direction at every iteration. Numerical results obtained by applying the proposed technique to about 30 test functions of varying dimensions clearly illustrate its effectiveness and potential.
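The abstract does not reproduce the paper's specific formulas, so the following is only a minimal sketch of the generic hybrid CG scheme it describes: a convex combination of two classical CG parameters (Fletcher-Reeves and Polak-Ribière are used here purely as placeholder constituents), a strong Wolfe line search for the step length, and a descent search direction at every iteration. The weight theta, the tolerances, and the restart rule are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search  # SciPy's line search enforces strong Wolfe conditions


def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Sketch of a hybrid CG iteration with beta = theta*beta_FR + (1-theta)*beta_PRP.

    The constituent formulas and the convex weight theta are placeholders;
    the paper's own hybrid parameter is not given in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                           # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, *_ = line_search(f, grad, x, d)       # strong Wolfe step length
        if alpha is None:                            # line search failed: restart with a small step
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)          # Fletcher-Reeves parameter
        beta_prp = g_new @ (g_new - g) / (g @ g)     # Polak-Ribiere parameter (clipped below at 0)
        beta = theta * beta_fr + (1.0 - theta) * max(beta_prp, 0.0)
        d = -g_new + beta * d                        # new search direction
        x, g = x_new, g_new
    return x


# Usage example on a standard test problem (Rosenbrock), analogous to the
# kind of benchmark functions the numerical experiments refer to.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)
```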
References
Al-Bayati, A.Y., Jameel, M.S.: New scaled proposed formulas for conjugate gradient methods in unconstrained optimization. Al-Rafidain Journal of Computer Sciences and Mathematics. 11, 25–46 (2014)
Dai, Y.-H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization. 10, 177–182 (1999)
Fathy, B.T., Younis, M.S.: Global Convergence Analysis of a new Hybrid Conjugate Gradient Method for Unconstraint Optimization Problems. In: Journal of Physics: Conference Series. p. 012063. IOP Publishing (2022)
Fletcher, R., Powell, M.J.D.: A rapidly convergent descent method for minimization. Comput J. 6, 163–168 (1963)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput J. 7, 149–154 (1964). https://doi.org/10.1093/comjnl/7.2.149
Hassan, B.A., Sadiq, H.M.: A new formula on the conjugate gradient method for removing impulse noise images. Bulletin of the South Ural State University, Series: Mathematical Modelling and Programming. 15, 123–130 (2022)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards. 49, 409–436 (1952)
Jameel, M., Al-Bayati, A., Algamal, Z.: Scaled multi-parameter (SMP) nonlinear QN-algorithms. In: AIP Conference Proceedings. AIP Publishing (2023)
Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J Optim Theory Appl. 69, 129–137 (1991)
Polak, E., Ribiere, G.: Note sur la convergence de méthodes de directions conjuguées. ESAIM: Mathematical Modelling and Numerical Analysis-Modélisation Mathématique et Analyse Numérique. 3, 35–43 (1969)
Salih, Y., Hamoda, M.A., Rivaie, M.: New hybrid conjugate gradient method with global convergence properties for unconstrained optimization. Malaysian Journal of Computing and Applied Mathematics. 1, 29–38 (2018)
License
Copyright (c) 2024 Bilangan : Jurnal Ilmiah Matematika, Kebumian dan Angkasa

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.