NLPModels

February 19, 2026


This package provides general guidelines to represent non-linear programming (NLP) problems in Julia and a standardized API to evaluate the functions and their derivatives. The main objective is to be able to rely on that API when designing optimization solvers in Julia.

How to Cite

If you use NLPModels.jl in your work, please cite using the format given in CITATION.cff.

Optimization Problems

Optimization problems are represented by an instance of (a subtype of) AbstractNLPModel. Such instances are composed of

  • an instance of NLPModelMeta, which provides information about the problem, including the number of variables, constraints, bounds on the variables, etc.
  • other data specific to the provenance of the problem.

See the documentation for details on the models and the API.

Installation

pkg> add NLPModels

Models

This package provides no models of its own, although it provides the tools needed to define models manually.

Check the list of packages that define models on this page of the docs.

Main Methods

If model is an instance of an appropriate subtype of AbstractNLPModel, the following methods are normally defined:

  • obj(model, x): evaluate f(x), the objective at x
  • cons(model, x): evaluate c(x), the vector of general constraints at x

The following methods are defined if first-order derivatives are available:

  • grad(model, x): evaluate ∇f(x), the objective gradient at x
  • jac(model, x): evaluate J(x), the Jacobian of c at x as a sparse matrix

If Jacobian-vector products can be computed more efficiently than by evaluating the Jacobian explicitly, the following methods may be implemented:

  • jprod(model, x, v): evaluate the result of the matrix-vector product J(x)⋅v
  • jtprod(model, x, u): evaluate the result of the matrix-vector product J(x)ᵀ⋅u
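As a sketch of this situation, consider a hypothetical constrained model (the model, objective, and constraint below are illustrative, not part of the package): with c(x) = x₁ + x₂ and J(x) = [1 1], the products J(x)⋅v and J(x)ᵀ⋅u can be coded directly instead of forming J explicitly.

```julia
using NLPModels

# Hypothetical model: minimize x₁² + x₂² subject to x₁ + x₂ = 1.
mutable struct ToyConModel{T, S} <: AbstractNLPModel{T, S}
  meta::NLPModelMeta{T, S}
  counters::Counters
end

function ToyConModel()
  meta = NLPModelMeta(
    2;            # number of variables
    ncon = 1,     # one general constraint
    lcon = [1.0], # 1 ≤ c(x) ≤ 1, i.e., an equality constraint
    ucon = [1.0],
    x0 = zeros(2),
    name = "toy-con",
  )
  return ToyConModel(meta, Counters())
end

function NLPModels.cons!(nlp::ToyConModel, x::AbstractVector, cx::AbstractVector)
  increment!(nlp, :neval_cons)
  cx[1] = x[1] + x[2]
  return cx
end

# J(x) = [1 1], so J(x)⋅v = v₁ + v₂ — no Jacobian matrix is ever formed.
function NLPModels.jprod!(nlp::ToyConModel, x::AbstractVector, v::AbstractVector, Jv::AbstractVector)
  increment!(nlp, :neval_jprod)
  Jv[1] = v[1] + v[2]
  return Jv
end

# J(x)ᵀ⋅u = [u₁, u₁].
function NLPModels.jtprod!(nlp::ToyConModel, x::AbstractVector, u::AbstractVector, Jtv::AbstractVector)
  increment!(nlp, :neval_jtprod)
  Jtv[1] = u[1]
  Jtv[2] = u[1]
  return Jtv
end

model = ToyConModel()
jprod(model, model.meta.x0, [1.0, 2.0])   # [3.0]
jtprod(model, model.meta.x0, [2.0])       # [2.0, 2.0]
```

Only the in-place jprod! and jtprod! are written by hand; NLPModels derives the allocating jprod and jtprod from them.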

The following method is defined if second-order derivatives are available:

  • hess(model, x, y): evaluate ∇²L(x,y), the Hessian of the Lagrangian at x and y as a sparse matrix

If Hessian-vector products can be computed more efficiently than by evaluating the Hessian explicitly, the following method may be implemented:

  • hprod(model, x, v, y): evaluate the result of the matrix-vector product ∇²L(x,y)⋅v

Several in-place variants of the methods above may also be implemented.
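A minimal sketch of these conventions on a hand-written unconstrained model (the model name and objective below are illustrative): obj is implemented directly, and only the in-place grad! is written by hand; NLPModels then provides the allocating grad from it.

```julia
using NLPModels

# Hypothetical model for f(x) = (x₁ - 1)² + 100 (x₂ - x₁²)² (Rosenbrock).
mutable struct RosenbrockModel{T, S} <: AbstractNLPModel{T, S}
  meta::NLPModelMeta{T, S}
  counters::Counters
end

RosenbrockModel() =
  RosenbrockModel(NLPModelMeta(2; x0 = [-1.2; 1.0], name = "rosenbrock"), Counters())

function NLPModels.obj(nlp::RosenbrockModel, x::AbstractVector)
  increment!(nlp, :neval_obj)   # keep the evaluation counters up to date
  return (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
end

# In-place gradient; the allocating grad(model, x) falls back to this method.
function NLPModels.grad!(nlp::RosenbrockModel, x::AbstractVector, gx::AbstractVector)
  increment!(nlp, :neval_grad)
  gx[1] = 2 * (x[1] - 1) - 400 * x[1] * (x[2] - x[1]^2)
  gx[2] = 200 * (x[2] - x[1]^2)
  return gx
end

model = RosenbrockModel()
obj(model, [1.0, 1.0])    # 0.0 at the minimizer
grad(model, [1.0, 1.0])   # [0.0, 0.0]
```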

The complete list of methods that an interface may implement can be found in the documentation.

Attributes

NLPModelMeta objects have the following attributes (with S <: AbstractVector):

| Attribute | Type | Notes |
|-----------|------|-------|
| nvar | Int | number of variables |
| x0 | S | initial guess |
| lvar | S | vector of lower bounds |
| uvar | S | vector of upper bounds |
| ifix | Vector{Int} | indices of fixed variables |
| ilow | Vector{Int} | indices of variables with lower bound only |
| iupp | Vector{Int} | indices of variables with upper bound only |
| irng | Vector{Int} | indices of variables with lower and upper bound (range) |
| ifree | Vector{Int} | indices of free variables |
| iinf | Vector{Int} | indices of visibly infeasible bounds |
| ncon | Int | total number of general constraints |
| nlin | Int | number of linear constraints |
| nnln | Int | number of nonlinear general constraints |
| y0 | S | initial Lagrange multipliers |
| lcon | S | vector of constraint lower bounds |
| ucon | S | vector of constraint upper bounds |
| lin | Vector{Int} | indices of linear constraints |
| nln | Vector{Int} | indices of nonlinear constraints |
| jfix | Vector{Int} | indices of equality constraints |
| jlow | Vector{Int} | indices of constraints of the form c(x) ≥ cl |
| jupp | Vector{Int} | indices of constraints of the form c(x) ≤ cu |
| jrng | Vector{Int} | indices of constraints of the form cl ≤ c(x) ≤ cu |
| jfree | Vector{Int} | indices of "free" constraints (there shouldn't be any) |
| jinf | Vector{Int} | indices of the visibly infeasible constraints |
| nnzo | Int | number of nonzeros in the gradient |
| nnzj | Int | number of nonzeros in the Jacobian |
| lin_nnzj | Int | number of nonzeros in the Jacobian of the linear constraints |
| nln_nnzj | Int | number of nonzeros in the Jacobian of the nonlinear constraints |
| nnzh | Int | number of nonzeros in the lower triangular part of the Hessian of the Lagrangian |
| minimize | Bool | true if optimize == minimize |
| islp | Bool | true if the problem is a linear program |
| name | String | problem name |
| variable_bounds_analysis | Bool | true if the partition of variables into fixed, lower-bounded, upper-bounded, range-bounded, free, and trivially infeasible sets is computed |
| constraint_bounds_analysis | Bool | true if the partition of constraints into equality, lower-bounded, upper-bounded, range-bounded, free, and trivially infeasible sets is computed |
| sparse_jacobian | Bool | true if the Jacobian of the constraints is sparse |
| sparse_hessian | Bool | true if the Hessian of the Lagrangian is sparse |
| grad_available | Bool | true if the gradient of the objective is available |
| jac_available | Bool | true if the Jacobian of the constraints is available |
| hess_available | Bool | true if the Hessian of the Lagrangian is available |
| jprod_available | Bool | true if the Jacobian-vector product J * v is available |
| jtprod_available | Bool | true if the transposed Jacobian-vector product J' * u is available |
| hprod_available | Bool | true if the Hessian-vector product of the Lagrangian H * v is available |
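The bound and constraint classification fields can be inspected by building an NLPModelMeta directly (a sketch; the bounds chosen below are illustrative):

```julia
using NLPModels

# Three variables: x₁ has a lower bound only, x₂ is free, x₃ is fixed at 1.
# Two constraints: the first is an equality, the second has an upper bound only.
meta = NLPModelMeta(
  3;
  x0 = zeros(3),
  lvar = [0.0, -Inf, 1.0],
  uvar = [Inf, Inf, 1.0],
  ncon = 2,
  lcon = [0.0, -Inf],
  ucon = [0.0, 5.0],
)

meta.nvar   # 3
meta.ilow   # [1]  variable with a lower bound only
meta.ifree  # [2]  free variable
meta.ifix   # [3]  fixed variable
meta.jfix   # [1]  equality constraint
meta.jupp   # [2]  constraint of the form c(x) ≤ cu
```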

Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Please start an issue or a discussion on the topic before opening a pull request.

If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.