Applying New Preconditioned Conjugated Gradient Algorithms to Unconstrained Optimization Problems

Authors

  • Sabreen M. Abbas University of Mosul
  • Abbas Y. Al-Bayati University of Mosul
  • Maysoon M. Aziz University of Telafer

DOI:

https://doi.org/10.62383/bilangan.v3i1.378

Keywords:

Preconditioned Conjugate Gradient Algorithms, Convergence Analysis, Numerical Optimization Techniques

Abstract

In this paper, we study a new, improved preconditioned conjugate gradient (PCG) algorithm based on Dai and Liao's procedure to enhance the CG algorithm of Maulana. The new PCG algorithm satisfies both the conjugacy condition and the sufficient descent condition. This work proposes improved conjugate gradient methods to enhance the efficiency and robustness of classical conjugate gradient methods. The study modifies the diagonal of the inverse Hessian approximation in the quasi-Newton Broyden-Fletcher-Goldfarb-Shanno (BFGS) update to build a preconditioner for nonlinear conjugate gradient (NCG) methods applied to large-scale unconstrained optimization problems. The step size of this two-term algorithm is computed with an accelerated Wolfe-Powell line search technique. The proposed new PCG algorithms are proven to be globally convergent under certain conditions reported in this paper.
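The general scheme the abstract describes can be illustrated with a minimal sketch: a nonlinear CG iteration using a Dai-Liao-style beta and a diagonal, secant-inspired preconditioner standing in for the diagonalized BFGS update. This is an assumption-laden illustration of the technique class, not the authors' exact algorithm; the Armijo backtracking search below is a simple stand-in for the accelerated Wolfe-Powell search used in the paper, and the parameter `t` and the clipping bounds are arbitrary choices.

```python
import numpy as np

def backtracking(f, x, d, g, alpha=1.0, c=1e-4, rho=0.5):
    """Simple Armijo backtracking line search (a stand-in for the
    accelerated Wolfe-Powell search described in the paper)."""
    fx = f(x)
    while f(x + alpha * d) > fx + c * alpha * (g @ d) and alpha > 1e-12:
        alpha *= rho
    return alpha

def pcg_minimize(f, grad, x0, t=0.1, max_iter=500, tol=1e-8):
    """Sketch of a preconditioned nonlinear CG method: Dai-Liao-style beta
    plus a diagonal preconditioner D approximating diag(inverse Hessian).
    Illustrative only -- not the authors' exact algorithm."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    D = np.ones_like(x)          # diagonal preconditioner, initially identity
    d = -D * g                   # preconditioned steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:
            # crude diagonal update motivated by the secant condition D y ~ s
            D = np.clip(np.abs(s) / (np.abs(y) + 1e-12), 1e-6, 1e6)
        dy = d @ y
        if abs(dy) > 1e-12:
            beta = (g_new @ (y - t * s)) / dy   # Dai-Liao beta with parameter t
            d = -D * g_new + beta * d
        else:
            d = -D * g_new                      # restart direction
        if d @ g_new >= 0:                      # safeguard: enforce descent
            d = -D * g_new
        x, g = x_new, g_new
    return x
```

On a diagonal quadratic the secant-based diagonal update recovers the inverse Hessian exactly, so the preconditioned direction behaves like a Newton step and the iteration converges rapidly.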


References

Andrei, N. (2007). Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim Methods Softw, 22(4), 561–571.

Andrei, N. (2020). Nonlinear conjugate gradient methods for unconstrained optimization. Springer.

Aziz, R. F., Younis, M. S., & Jameel, M. S. (2024). An investigation of a new hybrid conjugate gradient approach for unconstrained optimization problems. Bilangan: Jurnal Ilmiah Matematika, Kebumian dan Angkasa, 2(4), 11–23.

Dai, Y.-H., & Liao, L.-Z. (2001). New conjugacy conditions and related nonlinear conjugate gradient methods. Applied Mathematics and Optimization, 43, 87–101.

Dener, A., Denchfield, A., & Munson, T. (2019). Preconditioning nonlinear conjugate gradient with diagonalized quasi-Newton. In Proceedings of the Platform for Advanced Scientific Computing Conference (pp. 1–7).

Hager, W. W., & Zhang, H. (2006). A survey of nonlinear conjugate gradient methods. Pacific Journal of Optimization, 2(1), 35–58.

Jameel, M. S., Abdullah, Z. M., Fawzi, F. A., & Hassan, B. A. (2021). A new shifted conjugate gradient method based on shifted quasi-Newton condition. Journal of Physics: Conference Series, IOP Publishing, 12105.

Jameel, M. S., Basheer, G. T., Al-Bayati, A. Y., & Algamal, Z. Y. (2021). Parameter estimation of a truncated regression model based on improving numerical optimization algorithms. Journal of Physics: Conference Series, IOP Publishing, 012059.

Jameel, M., Al-Bayati, A., & Algamal, Z. (2023). Scaled multi-parameter (SMP) nonlinear QN-algorithms. AIP Conference Proceedings, AIP Publishing.

Knyazev, A. V., & Lashuk, I. (2008). Steepest descent and conjugate gradient methods with variable preconditioning. SIAM Journal on Matrix Analysis and Applications, 29(4), 1267–1280.

Stiefel, E. L. (1958). Kernel polynomials in linear algebra and their numerical applications. In Further contributions to the determination of eigenvalues (NBS Applied Math. Ser., vol. 49, pp. 1–22).

Zoutendijk, G. (1970). Nonlinear programming, computational methods. In Integer and nonlinear programming (pp. 37–86).

Published

2025-01-09

How to Cite

Sabreen M. Abbas, Abbas Y. Al-Bayati, & Maysoon M. Aziz. (2025). Applying New Preconditioned Conjugated Gradient Algorithms to Unconstrained Optimization Problems. Bilangan : Jurnal Ilmiah Matematika, Kebumian Dan Angkasa, 3(1), 60–78. https://doi.org/10.62383/bilangan.v3i1.378
