Java Optimization Library

DasOptimization (V1.0) is a lightweight, robust and scalable library for solving systems of nonlinear equations and nonlinear unconstrained optimization problems. Some of its capabilities are:

  • DasOptimization implements two descent algorithms: Line Search and Trust-Region-Dogleg.
  • The analytical gradient, Jacobian or Hessian can be supplied; if they are not available, they will be calculated with finite differences.
  • DasOptimization can recover from singular Hessian and Jacobian matrices.
  • DasOptimization is written purely in Java, so it can be used in Android projects.

EJML (Efficient Java Matrix Library) is used for matrix calculations. You need to include or build the EJML .jar files in your project library before running any DasOptimization code.

DasOptimization (V2.0) will include an optimizer capable of handling nonlinear constraints.

Solving a System of Nonlinear Equations

You need to use the NonlinearEquationSolver class to solve a system of nonlinear equations. Before creating a NonlinearEquationSolver object, you need to define your set of equations and the Jacobian (if available). You also need to create an Options object to specify user-defined options for the solver.

Options Object

Create an Options object as:

Options options = new Options(n);

where n is the number of variables.

This will be enough for most applications. However, you can fully customize the solver preferences; here are some of the important parameters:

options.setAnalyticalJacobian(true); // specify whether you will supply the analytical Jacobian (default: false)
options.setAlgorithm(Options.TRUST_REGION); // set the algorithm: Options.TRUST_REGION or Options.LINE_SEARCH (default: Options.TRUST_REGION)
options.setSaveIterationDetails(true); // save iteration details to a Results object (default: false)
options.setAllTolerances(1e-12); // set convergence tolerances (default: 1e-8)
options.setMaxIterations(1000); // set maximum number of iterations (default: 100)

ObjectiveFunctionNonLinear Object

To define the set of nonlinear equations that you want to solve, you need to implement the ObjectiveFunctionNonLinear interface. You also have to define the Jacobian matrix if you set the analytical Jacobian flag in the Options object.

import org.ejml.data.DMatrixRMaj; // EJML's dense matrix type

ObjectiveFunctionNonLinear f = new ObjectiveFunctionNonLinear() {
    // Rosenbrock system: f0 = 10*(x1 - x0^2), f1 = 1 - x0
    @Override
    public DMatrixRMaj getF(DMatrixRMaj x) {
        DMatrixRMaj f = new DMatrixRMaj(2, 1);
        f.set(0, 0, 10 * (x.get(1, 0) - x.get(0, 0) * x.get(0, 0)));
        f.set(1, 0, 1 - x.get(0, 0));
        return f;
    }

    @Override
    public DMatrixRMaj getJ(DMatrixRMaj x) {
        DMatrixRMaj J = new DMatrixRMaj(2, 2); // entries are zero by default, so J(1,1) = 0
        J.set(0, 0, -20 * x.get(0, 0));
        J.set(0, 1, 10);
        J.set(1, 0, -1);
        return J;
    }
};

If you do not want to define the Jacobian and want to use the finite difference Jacobian instead, just return null in the getJ method.
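For example, a minimal sketch of the same system relying entirely on the finite difference Jacobian (with setAnalyticalJacobian left at its default of false; the variable name is illustrative):

ObjectiveFunctionNonLinear fFiniteDifference = new ObjectiveFunctionNonLinear() {
    @Override
    public DMatrixRMaj getF(DMatrixRMaj x) {
        DMatrixRMaj f = new DMatrixRMaj(2, 1);
        f.set(0, 0, 10 * (x.get(1, 0) - x.get(0, 0) * x.get(0, 0)));
        f.set(1, 0, 1 - x.get(0, 0));
        return f;
    }

    @Override
    public DMatrixRMaj getJ(DMatrixRMaj x) {
        return null; // the solver falls back to a finite difference Jacobian
    }
};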

Running the Solver

The last step before solving the set of nonlinear equations is defining an initial guess vector:

DMatrixRMaj initialGuess = new DMatrixRMaj(2, 1);
initialGuess.set(0, 0, -1.2);
initialGuess.set(1, 0, 1.0);

Now, we are ready to create a NonlinearEquationSolver object using the previously defined options and equation definitions:

NonlinearEquationSolver nonlinearSolver = new NonlinearEquationSolver(f, options);

Finally, solve the system of nonlinear equations starting from the initial guess:

nonlinearSolver.solve(new DMatrixRMaj(initialGuess));

Nonlinear Unconstrained Optimization

You need to use the UnconstrainedOptimizer class for solving a nonlinear unconstrained optimization problem. Before creating an UnconstrainedOptimizer object, you need to define the objective function, the gradient vector (if available) and the Hessian matrix (if available). You also need to create an Options object to specify user-defined options for the solver.

Options Object

Create an Options object as:

Options options = new Options(n);

where n is the number of variables.

Options for nonlinear unconstrained optimization include specifying whether the analytical gradient and/or Hessian are supplied:

options.setAnalyticalGradient(true); // specify whether you will supply the analytical gradient (default: false)
options.setAnalyticalHessian(true); // specify whether you will supply the analytical Hessian (default: false)
options.setAlgorithm(Options.TRUST_REGION); // set the algorithm: Options.TRUST_REGION or Options.LINE_SEARCH (default: Options.TRUST_REGION)
options.setSaveIterationDetails(true); // save iteration details to a Results object (default: false)
options.setAllTolerances(1e-12); // set convergence tolerances (default: 1e-8)
options.setMaxIterations(1000); // set maximum number of iterations (default: 100)

ObjectiveFunctionUnconstrained Object

To define the objective function that you want to minimize, you need to implement the ObjectiveFunctionUnconstrained interface. You also have to define the gradient vector and/or the Hessian matrix if you set the appropriate flags in the Options object.

ObjectiveFunctionUnconstrained function = new ObjectiveFunctionUnconstrained() {
    // Beale function
    @Override
    public double getF(DMatrixRMaj x) {
        double x1 = x.get(0);
        double x2 = x.get(1);
        double t1 = 1.5 - x1 + x1 * x2;
        double t2 = 2.25 - x1 + x1 * x2 * x2;
        double t3 = 2.625 - x1 + x1 * x2 * x2 * x2;
        return t1 * t1 + t2 * t2 + t3 * t3;
    }

    @Override
    public DMatrixRMaj getG(DMatrixRMaj x) {
        double x1 = x.get(0);
        double x2 = x.get(1);
        DMatrixRMaj g = new DMatrixRMaj(2, 1);
        g.set(0, 0, 2.0 * (x2 * x2 - 1.0) * (x1 * x2 * x2 - x1 + 9.0 / 4.0)
                + 2.0 * (x2 * x2 * x2 - 1.0) * (x1 * x2 * x2 * x2 - x1 + 21.0 / 8.0)
                + 2.0 * (x2 - 1.0) * (x1 * x2 - x1 + 3.0 / 2.0));
        g.set(1, 0, 2.0 * x1 * (x1 * x2 - x1 + 3.0 / 2.0)
                + 4.0 * x1 * x2 * (x1 * x2 * x2 - x1 + 9.0 / 4.0)
                + 6.0 * x1 * x2 * x2 * (x1 * x2 * x2 * x2 - x1 + 21.0 / 8.0));
        return g;
    }

    @Override
    public DMatrixRMaj getH(DMatrixRMaj x) {
        double x1 = x.get(0);
        double x2 = x.get(1);
        DMatrixRMaj h = new DMatrixRMaj(2, 2);
        h.set(0, 0, 2.0 * (x2 - 1.0) * (x2 - 1.0)
                + 2.0 * (x2 * x2 - 1.0) * (x2 * x2 - 1.0)
                + 2.0 * (x2 * x2 * x2 - 1.0) * (x2 * x2 * x2 - 1.0));
        h.set(0, 1, 9.0 * x2 - 4.0 * x1 - 4.0 * x1 * x2 - 12.0 * x1 * x2 * x2
                + 8.0 * x1 * x2 * x2 * x2 + 12.0 * x1 * x2 * x2 * x2 * x2 * x2
                + (63.0 * x2 * x2) / 4.0 + 3.0);
        h.set(1, 0, h.get(0, 1)); // the Hessian is symmetric
        h.set(1, 1, (x1 * (63.0 * x2 - 4.0 * x1 - 24.0 * x1 * x2 + 24.0 * x1 * x2 * x2
                + 60.0 * x1 * x2 * x2 * x2 * x2 + 18.0)) / 2.0);
        return h;
    }
};

Return null in the gradient and/or Hessian methods if you want to use finite difference approximations.

Running the Optimizer

The last step before solving the optimization problem is defining an initial guess vector:

DMatrixRMaj initialGuess = new DMatrixRMaj(2, 1);
initialGuess.set(0, 0, -4.5);
initialGuess.set(1, 0, -4.5);

Now, we are ready to create an UnconstrainedOptimizer object using the previously defined options and function definitions:

UnconstrainedOptimizer unconstrainedSolver = new UnconstrainedOptimizer(function, options);

Finally, solve the nonlinear unconstrained optimization problem:

unconstrainedSolver.solve(new DMatrixRMaj(initialGuess));

Post-Processing

After calling unconstrainedSolver.solve(initialGuess) or nonlinearSolver.solve(initialGuess), you can access the solution details by:

solver.getX(); // returns the final x values
solver.getFx(); // returns the final function values
solver.getJacobian(); // returns the final Jacobian matrix (nonlinear equation solver only)
solver.getGx(); // returns the final gradient vector (unconstrained optimizer only)
solver.getHx(); // returns the final Hessian matrix (unconstrained optimizer only)
solver.getTerminationString(); // returns the convergence (or failure) details
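For example, after the unconstrained solve above, the solution can be inspected as follows (a minimal sketch; the matrix output format comes from EJML's print() method):

DMatrixRMaj xFinal = unconstrainedSolver.getX(); // final solution vector
xFinal.print(); // print the solution with EJML's built-in matrix printer
System.out.println(unconstrainedSolver.getTerminationString());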

Results Object

You can save the iteration details (with minimal effect on performance) by setting:

options.setSaveIterationDetails(true);

Iteration details can be obtained by:

Results results = solver.getResults(); // get the Results object from the solver
results.getX(); // get the list of x values
results.getFunctionNorm(); // get the list of function norms
results.getFunctionEvaluations(); // get the number of function evaluations

System of Nonlinear Equations Solver Tests

The NonlinearTest class under the test package contains a number of complicated test problems. The general structure of the test problems is:

NonlinearTest.test(int numberOfVariables, int solver, boolean analyticalJacobian);

where numberOfVariables is the number of equations, solver is either Options.TRUST_REGION or Options.LINE_SEARCH, and analyticalJacobian should be set to true if the analytical Jacobian is desired for the problem.

The list of problems is:

NonlinearTest.extendedRosenbrockFunction(int numberOfVariables, int solver, boolean analyticalJacobian) // numberOfVariables must be a multiple of two
NonlinearTest.powellSingularFunction(int numberOfVariables, int solver, boolean analyticalJacobian) // numberOfVariables must be a multiple of four; analytical Jacobian not available
NonlinearTest.trigonometricFunction(int numberOfVariables, int solver, boolean analyticalJacobian) // analytical Jacobian not available
NonlinearTest.helicalValleyFunction(int numberOfVariables, int solver, boolean analyticalJacobian) // numberOfVariables has no effect; analytical Jacobian not available
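For example, to run the extended Rosenbrock test with the trust-region algorithm and the analytical Jacobian (the problem size of 100 is an illustrative choice; it must be a multiple of two):

NonlinearTest.extendedRosenbrockFunction(100, Options.TRUST_REGION, true);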

Nonlinear Unconstrained Optimization Tests

The UnconstrainedTest class under the test package contains a number of complicated test problems. The general structure of the test problems is:

UnconstrainedTest.test(int algorithm, boolean analyticalGradient, boolean analyticalHessian);

where algorithm is either Options.TRUST_REGION or Options.LINE_SEARCH, and analyticalGradient and analyticalHessian should be set to true if the analytical gradient and/or Hessian are desired for the problem.

The list of problems is:

UnconstrainedTest.bealeFunction(int algorithm, boolean analyticalGradient, boolean analyticalHessian)
UnconstrainedTest.helicalValleyFunction(int algorithm, boolean analyticalGradient, boolean analyticalHessian) // analytical gradient and Hessian not available
UnconstrainedTest.woodFunction(int algorithm, boolean analyticalGradient, boolean analyticalHessian)
UnconstrainedTest.rosenbrockFunction(int algorithm, boolean analyticalGradient, boolean analyticalHessian)
UnconstrainedTest.powellSingularFunction(int algorithm, boolean analyticalGradient, boolean analyticalHessian)
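For example, to run the Beale test with the line-search algorithm and finite difference derivatives:

UnconstrainedTest.bealeFunction(Options.LINE_SEARCH, false, false);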