Calibrating the parameters of complex numerical models to fit real-world observations is one of the most common problems in fields such as finance, physics, engineering, and simulation. At Mark 27.1, the NAG Library introduces a new nonlinear least squares (NLN-LSQ) trust-region solver, handle_solve_bxnl (e04gg), for unconstrained and bound-constrained fitting problems, implementing a variety of algorithms and regularization techniques. It is aimed at small to medium-sized fitting problems (up to thousands of parameters) of the form

\[ \min_{x \in \mathbb{R}^n} f(x) = \frac{1}{2} \sum_{i=1}^{m} r_i(x)^2, \qquad r_i(x) = \phi(t_i, x) - y_i, \]
where the idea is to find optimal values for the parameters \(x\) of the model, represented by the smooth function \(\phi(t,x)\) (blue line), that fit the \(m\) observed data points \((t_{i},y_{i})\) (black dots); that is, to minimize the squared error between the model and the data (vertical red bars).
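The formulation above can be sketched in a few lines of Python. The exponential model and the data points below are invented purely for illustration and are not NAG-specific:

```python
import numpy as np

# Hypothetical two-parameter model: phi(t, x) = x0 * exp(-x1 * t)
def phi(t, x):
    return x[0] * np.exp(-x[1] * t)

# Observed data points (t_i, y_i) -- made up for this sketch
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([2.0, 1.2, 0.7, 0.4])

def residuals(x):
    # r_i(x) = phi(t_i, x) - y_i
    return phi(t, x) - y

def objective(x):
    # f(x) = 1/2 * sum_i r_i(x)^2 -- the quantity the solver minimizes
    r = residuals(x)
    return 0.5 * np.dot(r, r)

print(objective(np.array([2.0, 0.5])))
```

A least squares solver searches for the \(x\) that drives this objective as close to zero as the data allow.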
e04gg offers a significant improvement over the existing nonlinear least squares solvers in the NAG Library. In addition, it fills the gap between unconstrained solvers, such as lsq_uncon_quasi_deriv_comp (e04gb), and fully constrained ones, such as lsq_gencon_deriv (e04us).
e04gg is part of the NAG Optimization Modelling Suite, which provides a clear and consistent interface across the solvers in the suite.
The new solver stems from a collaboration with the Rutherford Appleton Laboratory [1] and demonstrates NAG’s ongoing effort to expand and improve its offering in mathematical optimization.
The new solver e04gg offers markedly improved robustness and a significant speed-up over the current alternatives in the Library, namely e04gb for unconstrained nonlinear least squares problems and e04us for problems with simple variable bounds. Upgrading to the new solver is highly recommended.
Our benchmarks comparing e04gg with e04gb on 68 unconstrained nonlinear least squares CUTEst problems are reported in Figure 1 (a)–(c) using performance profiles. The three plots show that the new solver is faster, solving 60% of the problems in less time (a), and more robust, solving 25% more problems. It also requires fewer user callbacks: 55% of the problems need fewer function calls (b) and 65% need fewer gradient evaluations (c).
As e04gb was not designed to tackle simple bounds directly, e04us would typically be used for such problems instead. However, being a more general solver, e04us does not exploit the structure of NLN-LSQ problems in the presence of simple bounds as fully as e04gg does. Comparing e04us and e04gg on 112 unconstrained and bound-constrained nonlinear least squares CUTEst problems, a speed-up is observed on 45% of the problems (d), together with a reduction in user callbacks: 65% of the problems require fewer function and gradient calls (e and f).
Figure 1: (a)–(c) Performance profiles comparing solvers e04gg and e04gb over 68 CUTEst unconstrained nonlinear least squares problems, while (d)–(f) report the performance profiles of e04gg and e04us for 112 CUTEst unconstrained and bound-constrained nonlinear least squares problems. Performance measures are time in seconds (a and d), number of function calls (b and e), and number of gradient calls (c and f). For the time plots, the higher line indicates the faster solver. For the centre and right column plots, the higher line represents fewer function and gradient calls, respectively.
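For readers unfamiliar with performance profiles, the curves in Figure 1 can be reproduced in spirit from raw timings. The sketch below uses made-up timings for two hypothetical solvers, not the actual benchmark data:

```python
import numpy as np

# Performance profile (Dolan-More style) for two solvers.
# For each problem, the performance ratio is time / best time across solvers;
# the profile rho(tau) is the fraction of problems a solver handles within a
# factor tau of the best. Failures are marked with np.inf.
times = np.array([
    # solver_a, solver_b  (hypothetical timings in seconds)
    [0.10, 0.15],
    [0.30, 0.20],
    [0.05, np.inf],   # solver_b failed on this problem
    [1.00, 1.10],
])

best = times.min(axis=1, keepdims=True)
ratios = times / best

def profile(r, tau):
    # Fraction of problems with performance ratio <= tau
    return np.mean(r <= tau)

for tau in (1.0, 1.5, 2.0):
    print(tau, profile(ratios[:, 0], tau), profile(ratios[:, 1], tau))
```

The higher a solver's curve, the larger the share of problems it solves within a given factor of the fastest run, which is why the higher line in the time plots of Figure 1 indicates the faster solver.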
The following example illustrates the use of e04gg to fit PADC etched nuclear track data to a convoluted distribution. A target sheet is scanned and the track diameters are recorded (red wedges in the left image of Figure 2) into a histogram (blue bars in the right plot of Figure 2), and a mixed Normal and log-Normal model is fitted to the resulting experimental histogram. e04gg is used to fit the six-parameter model
using the histogram heights. Thanks to the use of regularization and residual weights, e04gg provides a robust solution \(x^*\) that unfolds the parameters of the two distributions (red and blue curves in the right plot of Figure 2). Adding these together produces the green curve, which is the one used to perform the fit. The data and complete Python source code for this example are available here.
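A minimal sketch of this kind of two-component fit is shown below, using SciPy's least_squares in place of the NAG e04gg interface. The model form, parameters, and synthetic "histogram heights" are assumptions made for illustration and do not reproduce the actual example:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical six-parameter sum of a Normal and a log-Normal peak;
# an illustrative stand-in for the model in the text.
def model(t, x):
    a1, mu1, s1, a2, mu2, s2 = x
    normal = a1 * np.exp(-(t - mu1) ** 2 / (2.0 * s1 ** 2))
    lognormal = a2 / t * np.exp(-(np.log(t) - mu2) ** 2 / (2.0 * s2 ** 2))
    return normal + lognormal

# Synthetic histogram heights generated from known parameters plus noise
rng = np.random.default_rng(0)
t = np.linspace(0.5, 10.0, 60)            # bin centres (t > 0 for the log-Normal)
x_true = [1.0, 6.0, 1.0, 2.0, 0.5, 0.4]
y = model(t, x_true) + 0.01 * rng.standard_normal(t.size)

# Residuals r_i(x) = model(t_i, x) - y_i; simple bounds keep amplitudes
# and widths positive, mirroring e04gg's support for bound constraints
def residuals(x):
    return model(t, x) - y

x0 = [0.5, 5.0, 2.0, 1.0, 0.0, 1.0]
lb = [0.0, 0.0, 1e-3, 0.0, -5.0, 1e-3]
ub = [10.0, 10.0, 5.0, 10.0, 5.0, 5.0]
res = least_squares(residuals, x0, bounds=(lb, ub))
print(res.cost)
```

The fitted `res.x` separates the two components, which can then be plotted individually, as in the red and blue curves of Figure 2.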
For further examples and reading visit our GitHub Local Optimization page.