Mathematical models can be solved faster using the new NAG SOCP solver

New: Second-order Cone Programming, Derivative-free Optimization and First-order Active-set solvers, plus Non-negative Matrix Factorization, Nearest Correlation Matrix additions and more adjoints of NAG routines

July 25, 2019 – The Numerical Algorithms Group (NAG), experts in algorithms, software and HPC, announces the latest Mark of its flagship software, the NAG Library. At Mark 27, NAG have introduced new mathematical optimization solvers for Second-order Cone Programming and Derivative-free Optimization, and a First-order Active-set method. In addition to the new optimization solvers, the NAG Library Mark 27 features new routines for Nearest Correlation Matrices and Non-negative Matrix Factorization, as well as more adjoints of NAG Library functions.

New Optimization Solvers in the NAG Optimization Modelling Suite

Included in the NAG Optimization Modelling Suite is a new Second-order Cone Programming (SOCP) solver based on an interior point method. SOCP has become an important tool in fields ranging from engineering and control theory to quantitative finance, owing to the wide range of convex problems it can express, such as quadratically constrained quadratic programming (QCQP), robust linear programming and many others.
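To make the QCQP connection concrete, the following NumPy sketch (illustrative only, not NAG code) numerically checks the textbook reformulation of a convex quadratic constraint as membership in a second-order cone, which is how an SOCP solver can absorb QCQP problems:

```python
import numpy as np

# A convex quadratic constraint  x'Px + q'x + r <= 0  with P = L L'
# is equivalent to the second-order cone constraint
#   || (L'x, (1 + q'x + r)/2) ||_2  <=  (1 - q'x - r)/2.

rng = np.random.default_rng(0)
n = 4
L = rng.standard_normal((n, n))
P = L @ L.T                      # positive semidefinite by construction
q = rng.standard_normal(n)
r = -5.0

def quad_ok(x):
    """Does x satisfy the quadratic constraint?"""
    return x @ P @ x + q @ x + r <= 0.0

def soc_ok(x):
    """Does x satisfy the equivalent second-order cone constraint?"""
    s = q @ x + r
    u = np.append(L.T @ x, (1.0 + s) / 2.0)
    return np.linalg.norm(u) <= (1.0 - s) / 2.0

# The two formulations agree on random test points.
for _ in range(200):
    x = rng.standard_normal(n)
    assert quad_ok(x) == soc_ok(x)
```

The same pattern (taking a Cholesky-like factor of the quadratic term) underlies most QCQP-to-SOCP conversions.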


Image shows the feasible region of an SOCP problem with three variables


A new set of Derivative-free Optimization (DFO) solvers has been integrated into the Library at Mark 27. They are aimed at optimizing black-box models and can handle either calibration (nonlinear least squares) problems or problems with a generic objective function. The solvers, available with both direct and reverse communication interfaces, should show an improved convergence rate compared with the existing DFO solvers in the NAG Library.
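The black-box calibration setting can be illustrated with a toy derivative-free method. The sketch below (a simple compass/coordinate search, not NAG's algorithm) fits a two-parameter model to data using only objective values, never derivatives:

```python
import numpy as np

def compass_search(f, x0, step=0.5, tol=1e-8, max_evals=100_000):
    """Minimize f by probing +/- step along each coordinate axis;
    halve the step when no probe improves the objective."""
    x = np.asarray(x0, float)
    fx = f(x)
    evals = 0
    while step > tol and evals < max_evals:
        improved = False
        for i in range(x.size):
            for d in (step, -step):
                y = x.copy()
                y[i] += d
                fy = f(y)
                evals += 1
                if fy < fx:
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5   # no progress at this scale: refine the mesh
    return x

# Calibrate y = a * exp(b * t) to noise-free data generated with a=2, b=-1.
t = np.linspace(0.0, 2.0, 20)
data = 2.0 * np.exp(-t)
sse = lambda p: float(np.sum((p[0] * np.exp(p[1] * t) - data) ** 2))

p_hat = compass_search(sse, [1.0, 0.0])
```

A model-based DFO solver, like those added at Mark 27, builds local surrogate models of the objective and typically needs far fewer evaluations than this naive search.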

The third new optimization solver introduced at Mark 27 is a First-order Active-set (FOAS) method for bound-constrained, large-scale nonlinear programming problems. The new solver is based on a nonlinear conjugate gradient method; it requires only first derivatives and has low memory requirements, making it suitable for very large-scale problems.
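To show why first-order, bound-constrained methods scale so well, here is a minimal projected-gradient sketch (a simpler relative of nonlinear conjugate gradients, not NAG's FOAS algorithm): it needs only gradients and a few vectors of storage.

```python
import numpy as np

def projected_gradient(grad, x0, lo, hi, lr=0.1, iters=500):
    """Bound-constrained first-order method: take a gradient step,
    then project back onto the box [lo, hi]. Storage is O(n)."""
    x = np.clip(np.asarray(x0, float), lo, hi)
    for _ in range(iters):
        x = np.clip(x - lr * grad(x), lo, hi)
    return x

# Minimize ||x - c||^2 subject to 0 <= x <= 1.
# The exact solution is c clipped componentwise to [0, 1].
c = np.array([-0.5, 0.3, 1.7])
x_star = projected_gradient(lambda x: 2.0 * (x - c), np.zeros(3), 0.0, 1.0)
```

An active-set method additionally tracks which bounds are active at the solution (here the first and third components), so it can restrict its first-order search to the free variables.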

To make these new solvers accessible, they are provided through the NAG Optimization Modelling Suite. The suite allows users to build up a problem in stages, instead of calling one monolithic solver with many arguments, which makes it simpler to use and easier to avoid mistakes.
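The staged workflow can be pictured with a toy builder. All names below are invented for illustration and are not the NAG API; the point is only the design: assemble the problem piece by piece on a handle, then solve, rather than pass everything to one call.

```python
# Hypothetical sketch of a staged problem-building workflow
# (invented names, not the NAG Optimization Modelling Suite API).

class ProblemHandle:
    def __init__(self, nvar):
        self.nvar = nvar
        self.parts = {}

    def set_linear_objective(self, c):
        self.parts["objective"] = ("linear", list(c))
        return self

    def set_bounds(self, lo, hi):
        self.parts["bounds"] = (list(lo), list(hi))
        return self

    def solve(self):
        # A real suite would dispatch here to a solver compatible
        # with the parts that have been defined.
        missing = {"objective", "bounds"} - self.parts.keys()
        if missing:
            raise ValueError(f"problem not fully defined: {missing}")
        return self.parts

# The problem is built in readable stages, not one monolithic call.
handle = ProblemHandle(nvar=2)
handle.set_linear_objective([1.0, 2.0]).set_bounds([0.0, 0.0], [1.0, 1.0])
result = handle.solve()
```

A side benefit of this pattern is early, specific error reporting: an incompletely defined problem is rejected before any solver runs.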


The new solvers add to NAG’s existing optimization routines featured in the Library, which cover the following areas:

  • Unconstrained and constrained nonlinear programming
  • Nonlinear least squares, data fitting
  • Linear and quadratic programming
  • Semidefinite programming
  • Derivative-free optimization
  • Global optimization
  • Mixed-integer nonlinear optimization


Other additions in the NAG Mark 27 Library

  • Non-negative Matrix Factorization
  • Adjoint NAG Library routines
  • Randomized Numerical Linear Algebra
  • Nearest Correlation Matrix Functions
  • Interpolation routines
  • Updated Mixed Effects Regression
  • Least Squares and Eigenvalue Problems (LAPACK)

The NAG Library is expertly developed, maintained, documented and supported. NAG’s team is proud that much of the new content was added in direct response to user requests, including those from major financial institutions, market intelligence companies, and academic and research institutes.

The inherent flexibility of the mathematical and statistical functions in the NAG Library enables it to be used across multiple programming languages, environments and operating systems, including C and C++, Python, Excel, Java, .NET, MATLAB, Visual Basic, Fortran and many more.

The NAG Library includes algorithms that have been contributed or developed on a collaborative basis. We extend thanks to the Mark 27 contributors and collaborators: Coralia Cartis and Lindon Roberts (The University of Oxford); Hou-Duo Qi, Defeng Sun and Nicholas J. Higham (The University of Manchester); Uwe Naumann, Johannes Lotz and Klaus Leppkes (RWTH Aachen); and Fabio Cannizo.