An improved accelerated 3-term conjugate gradient algorithm with second-order Hessian approximation for nonlinear least-squares optimization
Authors
R. B. Yunus
- Department of Fundamental and Applied Sciences, Faculty of Science and Information Technology, Universiti Teknologi PETRONAS, Bandar Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia.
N. Zainuddin
- Department of Fundamental and Applied Sciences, Faculty of Science and Information Technology, Universiti Teknologi PETRONAS, Bandar Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia.
H. Daud
- Department of Fundamental and Applied Sciences, Faculty of Science and Information Technology, Universiti Teknologi PETRONAS, Bandar Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia.
R. Kannan
- Department of Electrical and Electronics Engineering, Universiti Teknologi PETRONAS, Bandar Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia.
M. M. Yahaya
- Department of Mathematics, Faculty of Science, King Mongkut’s University of Technology Thonburi (KMUTT), 126 Pracha-Uthit Road, Bang Mod, Thung Khru, Bangkok 10140, Thailand.
A. Al-Yaari
- Department of Fundamental and Applied Sciences, Faculty of Science and Information Technology, Universiti Teknologi PETRONAS, Bandar Seri Iskandar 32610, Perak Darul Ridzuan, Malaysia.
Abstract
Nonlinear least-squares (NLS) problems arise extensively across the applied sciences. Conventional methods for solving NLS problems often face challenges in computational efficiency and memory requirements, especially for large-scale systems. In this paper, a structured accelerated three-term conjugate gradient method is proposed for minimizing NLS problems. From a Taylor series approximation of the objective function's Hessian, a structured vector approximating the action of the Hessian on a vector is derived, so that a quasi-Newton (secant-type) condition is satisfied. This structured vector is then used to incorporate additional second-order information from the Hessian of the objective function into the search direction. The search direction of the proposed method satisfies the requisite descent condition. Numerical experiments on a variety of test problems show that the proposed approach is remarkably efficient and outperforms several existing methods.
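To make the description above concrete, the sketch below outlines one plausible form of a structured three-term conjugate gradient iteration for minimizing f(x) = 0.5*||r(x)||^2. It is a minimal illustration, not the authors' exact scheme: the function name structured_three_term_cg, the backtracking line search, the Hestenes-Stiefel/Dai-Liao-type coefficient, and the particular structured secant-like vector z (a Gauss-Newton term plus a difference-of-Jacobians correction) are assumptions chosen only to show how such a method is typically assembled.

import numpy as np

def structured_three_term_cg(r, J, x0, tol=1e-6, max_iter=500, sigma=1e-4, rho=0.5):
    # r(x): residual vector in R^m, J(x): its m-by-n Jacobian (both callables)
    x = np.asarray(x0, dtype=float).copy()
    rk, Jk = r(x), J(x)
    g = Jk.T @ rk                        # gradient of f(x) = 0.5*||r(x)||^2
    d = -g                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # backtracking (Armijo) line search along d
        t, f_old, slope = 1.0, 0.5 * (rk @ rk), g @ d
        for _ in range(60):
            if 0.5 * np.linalg.norm(r(x + t * d)) ** 2 <= f_old + sigma * t * slope:
                break
            t *= rho
        x_new = x + t * d
        r_new, J_new = r(x_new), J(x_new)
        g_new = J_new.T @ r_new
        s = x_new - x
        # structured secant-like vector: Gauss-Newton part J^T J s plus
        # (J_{k+1} - J_k)^T r_{k+1}, a cheap surrogate for the action of the
        # second-order part of the Hessian on s
        z = J_new.T @ (J_new @ s) + (J_new - Jk).T @ r_new
        denom = d @ z
        if abs(denom) > 1e-12:
            beta = (g_new @ z) / denom   # HS/Dai-Liao-type coefficient
            theta = (g_new @ d) / denom  # coefficient of the third term
            d = -g_new + beta * d - theta * z
        else:
            d = -g_new                   # safeguard: restart with steepest descent
        if g_new @ d > -1e-10 * (g_new @ g_new):
            d = -g_new                   # enforce a descent direction
        x, rk, Jk, g = x_new, r_new, J_new, g_new
    return x

As a usage illustration, the sketch can be tested on r(x) = (10(x2 - x1^2), 1 - x1) with its Jacobian, i.e. the Rosenbrock function written in least-squares form, whose minimizer is (1, 1). By construction this three-term update gives g^T d = -||g||^2, one common way of meeting a sufficient descent condition.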
Share and Cite
ISRP Style
R. B. Yunus, N. Zainuddin, H. Daud, R. Kannan, M. M. Yahaya, A. Al-Yaari, An improved accelerated 3-term conjugate gradient algorithm with second-order Hessian approximation for nonlinear least-squares optimization, Journal of Mathematics and Computer Science, 36 (2025), no. 3, 263--274
AMA Style
Yunus R. B., Zainuddin N., Daud H., Kannan R., Yahaya M. M., Al-Yaari A., An improved accelerated 3-term conjugate gradient algorithm with second-order Hessian approximation for nonlinear least-squares optimization. J Math Comput SCI-JM. (2025); 36(3):263--274
Chicago/Turabian Style
Yunus, R. B., Zainuddin, N., Daud, H., Kannan, R., Yahaya, M. M., Al-Yaari, A. "An improved accelerated 3-term conjugate gradient algorithm with second-order Hessian approximation for nonlinear least-squares optimization." Journal of Mathematics and Computer Science 36, no. 3 (2025): 263--274.
Keywords
- Three-term
- nonlinear
- least-squares
- accelerated
- conjugate gradient
MSC