A Comparison of Nonlinear Optimization Methods for Rosenbrock, Booth, and Matyas Functions
DOI: https://doi.org/10.47611/jsrhs.v12i4.5236
Keywords: Booth Function, Matyas Function, nonlinear optimization, quasi-Newton method, Rosenbrock Function, Simplex Method
Abstract
There is increasing interest in applying nonlinear optimization techniques to a broad range of real-world problems, driven by rapid advances in novel technologies and renewed interest in areas such as sustainable development. While many algorithms for nonlinear optimization have been developed, they differ widely in how they converge to a solution. Consequently, matching the function to be optimized with the right optimization algorithm is essential. To further explore this relationship, we studied two nonlinear optimization methods: the Nelder-Mead simplex method, which uses only function evaluations, and the quasi-Newton method, which requires estimation of derivatives. These algorithms were used to find the global minima of the Rosenbrock, Booth, and Matyas functions from multiple starting points. The convergence paths from each starting point, for both optimization methods and all three functions, were visualized on contour plots. While convergence to the global minimum was observed in all instances, our analysis indicated that the quasi-Newton method was consistently more efficient, requiring fewer iterations than the simplex method. This difference was especially pronounced for the Booth and Matyas functions, which required roughly 10-fold and 20-fold fewer iterations, respectively, to converge. Our analysis reinforces the need to carefully match the properties of the function being minimized with the performance characteristics of the optimization approach to obtain fast convergence to the global minimum.
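The comparison described in the abstract can be reproduced in outline with off-the-shelf tools. The sketch below (not the authors' code) uses SciPy's `minimize` with `method="Nelder-Mead"` for the simplex method and `method="BFGS"` for the quasi-Newton method (with the gradient estimated by finite differences), applied to the standard definitions of the three test functions; the starting point is an arbitrary choice for illustration.

```python
# Illustrative sketch: comparing iteration counts of the Nelder-Mead
# simplex method and the BFGS quasi-Newton method on the Rosenbrock,
# Booth, and Matyas test functions.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(p):
    x, y = p
    return (1 - x)**2 + 100 * (y - x**2)**2      # global minimum at (1, 1)

def booth(p):
    x, y = p
    return (x + 2*y - 7)**2 + (2*x + y - 5)**2   # global minimum at (1, 3)

def matyas(p):
    x, y = p
    return 0.26 * (x**2 + y**2) - 0.48 * x * y   # global minimum at (0, 0)

start = np.array([-3.0, 4.0])  # one of many possible starting points
for name, f in [("Rosenbrock", rosenbrock), ("Booth", booth), ("Matyas", matyas)]:
    nm = minimize(f, start, method="Nelder-Mead")
    qn = minimize(f, start, method="BFGS")  # gradient via finite differences
    print(f"{name}: Nelder-Mead {nm.nit} iterations, BFGS {qn.nit} iterations")
```

Recording `nm.nit` and `qn.nit` over a grid of starting points, and plotting the intermediate iterates on contour plots of each function, gives the kind of comparison the study reports.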
References or Bibliography
Bartholomew-Biggs, M. (2005). Nonlinear Optimization with Financial Applications. Springer. https://doi.org/10.1007/b102601
Bartholomew-Biggs, M. (2008). Nonlinear Optimization with Engineering Applications. Springer.
Beck, A. (2014). Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB. SIAM-Society for Industrial and Applied Mathematics.
Cui, Y., Geng, Z., Zhu, Q., & Han, Y. (2017). Review: Multi-objective optimization methods and application in energy saving. Energy, 125, 681–704. https://doi.org/10.1016/j.energy.2017.02.174
Edgar, T. F., & Himmelblau, D. M. (1987). Optimization of Chemical Processes. McGraw-Hill College.
Nelder, J. A., & Mead, R. (1965). A Simplex Method for Function Minimization. The Computer Journal, 7(4), 308–313. https://doi.org/10.1093/comjnl/7.4.308
Rosenbrock, H. H. (1960). An Automatic Method for Finding the Greatest or Least Value of a Function. The Computer Journal, 3(3), 175–184. https://doi.org/10.1093/comjnl/3.3.175
Salleh, Z., Almarashi, A., & Alhawarat, A. (2022). Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property. Journal of Inequalities and Applications, 2022. https://doi.org/10.1186/s13660-021-02746-0
Wang, Z., & Luo, Q. (2021). Hybrid metaheuristic algorithm using butterfly and flower pollination base on mutualism mechanism for global optimization problems. Engineering with Computers, 37. https://doi.org/10.1007/s00366-020-01025-8
Copyright (c) 2023 Niah Goudar; Kristen Skaff
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute & display this article.