mklJac

Estimate the gradient or Jacobian of a function via numerical differences

Syntax

jac = mklJac(fun,x)

jac = mklJac(fun,x,nrow)

jac = mklJac(fun,x,nrow,tol)

[jac,status] = mklJac(fun,x,nrow,tol)

Description

This function uses the Intel MKL[1] routine djacobi to estimate the gradient or Jacobian of a function at the point x using central differences. If you do not supply a gradient or Jacobian to a nonlinear optimizer that requires one, this method is used by default.
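The central-difference scheme underlying the routine can be sketched in plain MATLAB. The sketch below is an illustration of the numerical method only, not the MKL implementation; the function name centralDiffJac and the fixed step h are assumptions for the example.

```matlab
% Central-difference Jacobian sketch (illustration only, not the MKL code).
% fun returns an m x 1 vector, x is an n x 1 point, h is the step size.
function jac = centralDiffJac(fun, x, h)
    n   = length(x);
    f0  = fun(x);                 % one call to size the output
    jac = zeros(length(f0), n);
    for i = 1:n
        e    = zeros(n, 1);
        e(i) = h;                 % perturb variable i only
        % Symmetric (central) difference: O(h^2) truncation error
        jac(:, i) = (fun(x + e) - fun(x - e)) / (2*h);
    end
end
```

Each column of the Jacobian costs two function evaluations, which is why supplying an analytical derivative to the optimizer is preferable when one is available.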

jac = mklJac(fun,x) uses the function handle fun and the current state vector x to estimate the gradient of the function.

jac = mklJac(fun,x,nrow) specifies the number of rows in the vector returned by fun, which determines the number of rows in the returned Jacobian. If nrow is not specified, an extra call to fun is made to determine it.

jac = mklJac(fun,x,nrow,tol) specifies the tolerance of the numerical difference algorithm, applied across all variables. The default tol is 1e-6.

[jac,status] = mklJac(fun,x,nrow,tol) also returns status, which is 1 if the algorithm succeeded or 0 if it failed.
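For instance, the status flag can be checked when estimating the gradient of a scalar function. The Rosenbrock objective below is chosen purely for illustration; its analytical gradient at the origin is [-2 0].

```matlab
% Gradient of a scalar (1-row) function, with the success flag checked.
fun = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;  % Rosenbrock function
x0  = [0; 0];
[jac, status] = mklJac(fun, x0, 1, 1e-6);
if status ~= 1
    error('mklJac failed to estimate the gradient');
end
% jac should be close to the analytical gradient [-2 0]
```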

Example

Estimate the Jacobian of the following nonlinear constraint function at x = [1; 1; 1; 1]:

nlcon = @(x) [8 - x(1)^2 - x(2)^2 - x(3)^2 - x(4)^2 - x(1) + x(2) - x(3) + x(4);
              10 - x(1)^2 - 2*x(2)^2 - x(3)^2 - 2*x(4)^2 + x(1) + x(4);
              5 - 2*x(1)^2 - x(2)^2 - x(3)^2 - 2*x(1) + x(2) + x(4)];

x = [1;1;1;1];

jac = mklJac(nlcon,x,3)


jac =

-3.0000 -1.0000 -3.0000 -1.0000
-1.0000 -4.0000 -2.0000 -3.0000
-6.0000 -1.0000 -2.0000  1.0000