A new spectral conjugate gradient method for unconstrained optimization and its application in neural networks
Authors
A. M. Abdulrahman
- College of Science, University of Duhok, Iraq.
B. G. Fathi
- College of Science, University of Zakho, Iraq.
H. Y. Najm
- College of Science, University of Duhok, Iraq.
Abstract
This work introduces a new variant of the Hestenes-Stiefel (HS) nonlinear conjugate gradient method that combines the advantages of the spectral conjugate gradient method with the conjugacy condition of the quasi-Newton method. The proposed method uses inexact line searches and is a descent method. Under line searches satisfying the Wolfe conditions, we establish the sufficient descent property and global convergence, provided that appropriate conditions hold. We also report numerical experiments on benchmark functions commonly used in optimization to evaluate the effectiveness of the proposed method; the results show that it outperforms the classical HS method. Finally, we apply the new technique to training neural networks (NNs), demonstrating its practicality for non-traditional optimization tasks.
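The abstract does not reproduce the paper's formulas, so the following Python sketch only illustrates the general shape of a spectral HS-type conjugate gradient iteration with a Wolfe line search, as described above. The function name `spectral_hs_cg`, the Barzilai-Borwein-style spectral scaling used for theta, and the descent safeguard are illustrative assumptions, not the authors' method; only the classical HS beta and the Wolfe line search come from standard CG theory.

```python
# Minimal sketch of a spectral HS-type conjugate gradient loop with a Wolfe
# line search. The spectral parameter theta below is a Barzilai-Borwein-style
# stand-in, NOT the formula proposed in the paper.
import numpy as np
from scipy.optimize import line_search

def spectral_hs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the standard Wolfe conditions
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        s = x_new - x                        # step s_k = alpha_k * d_k
        dy = d @ y
        beta = (g_new @ y) / dy if abs(dy) > 1e-12 else 0.0   # classical HS beta
        sy = s @ y
        theta = (s @ s) / sy if abs(sy) > 1e-12 else 1.0      # spectral scaling (stand-in)
        d = -theta * g_new + beta * d        # spectral CG direction
        if d @ g_new >= 0:                   # safeguard: enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the classical starting point
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(spectral_hs_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```

In this sketch the safeguard falls back to the steepest-descent direction whenever the computed direction fails to be a descent direction; the paper instead proves the sufficient descent property analytically under the Wolfe conditions.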
Share and Cite
ISRP Style
A. M. Abdulrahman, B. G. Fathi, H. Y. Najm, A new spectral conjugate gradient method for unconstrained optimization and its application in neural networks, Journal of Mathematics and Computer Science, 36 (2025), no. 3, 326--332
AMA Style
Abdulrahman A. M., Fathi B. G., Najm H. Y., A new spectral conjugate gradient method for unconstrained optimization and its application in neural networks. J Math Comput SCI-JM. 2025;36(3):326--332
Chicago/Turabian Style
Abdulrahman, A. M., Fathi, B. G., Najm, H. Y. "A new spectral conjugate gradient method for unconstrained optimization and its application in neural networks." Journal of Mathematics and Computer Science, 36, no. 3 (2025): 326--332
Keywords
- Unconstrained optimization
- conjugate gradient method
- Wolfe condition
- sufficient descent
- global convergence
- neural networks
MSC
References
- [1] N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim., 10 (2008), 147–161
- [2] N. Andrei, New accelerated conjugate gradient algorithms for unconstrained optimization, ICI Technical Report, (2008), 1–18
- [3] N. Andrei, New accelerated conjugate gradient algorithms as a modification of Dai-Yuan’s computational scheme for unconstrained optimization, J. Comput. Appl. Math., 234 (2010), 3397–3410
- [4] Y.-H. Dai, L.-Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43 (2001), 87–101
- [5] Y. H. Dai, Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., 10 (1999), 177–182
- [6] H. Demuth, M. Beale, Neural network toolbox user’s guide, MathWorks, (2000)
- [7] E. D. Dolan, J. J. Moré, Benchmarking optimization software with performance profiles, Math. Program., 91 (2002), 201–213
- [8] R. Fletcher, C. M. Reeves, Function minimization by conjugate gradients, Comput. J., 7 (1964), 149–154
- [9] J. C. Gilbert, J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2 (1992), 21–42
- [10] S. B. Hanachi, B. Sellami, M. Belloufi, A new family of hybrid conjugate gradient method for unconstrained optimization and its application to regression analysis, RAIRO Oper. Res., 58 (2021), 613–627
- [11] M. R. Hestenes, E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49 (1952), 409–436
- [12] N. A. Japri, S. Basri, M. Mamat, New modification of the Hestenes-Stiefel with strong Wolfe line search, AIP Conf. Proc., 2335 (2021), 1–7
- [13] Y. Li, S. Du, Modified HS conjugate gradient method for solving generalized absolute value equations, J. Inequal. Appl., 2019 (2019), 12 pages
- [14] B. T. Polyak, The conjugate gradient method in extremal problems, USSR Comput. Math. Math. Phys., 9 (1969), 94–112
- [15] E. Polak, G. Ribière, Note on the convergence of methods of conjugate directions, Rev. Française Inform. Rech. Opér., 3 (1969), 35–43
- [16] D. E. Rumelhart, J. L. McClelland, PDP Research Group, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, MIT Press, Cambridge, MA (1986)
- [17] C. Souli, R. Ziadi, A. Bencherif-Madani, H. M. Khudhur, A hybrid CG algorithm for nonlinear unconstrained optimization with application in image restoration, J. Math. Model., 12 (2024), 301–317
- [18] G. Wang, R. Shan, W. Huang, W. Liu, J. Zhao, Two new spectral conjugate gradient algorithms based on Hestenes-Stiefel, J. Algorithms Comput. Technol., 11 (2017), 345–352
- [19] X. Wu, Y. Zhu, J. Yin, A HS-PRP-type hybrid conjugate gradient method with sufficient descent property, Comput. Intell. Neurosci., 2021 (2021), 8 pages
- [20] H. Yabe, M. Takano, Global convergence properties of new nonlinear conjugate gradient methods for unconstrained optimization, Comput. Optim. Appl., 28 (2004), 203–225
- [21] R. Ziadi, A. Bencherif-Madani, A perturbed quasi-Newton algorithm for bound-constrained global optimization, J. Comput. Math., 20 (2023), 1–29
- [22] R. Ziadi, A. Bencherif-Madani, A mixed algorithm for smooth global optimization, J. Math. Model., 11 (2023), 207–228
- [23] N. Zullpakkal, N. ‘Aini, N. Ghani, N. Mohamed, N. Idalisa, M. Rivaie, COVID-19 data modelling using hybrid conjugate gradient method, J. Inform. Optim. Sci., 4 (2022), 837–853