
trust-constr

trust-constr optimization algorithm from the SciPy project that was originally implemented by Antonio Horta Ribeiro. This is a version of the trust-constr algorithm that does not depend on the rest of SciPy. The only dependency is NumPy. The goal is to have a version of the trust-constr algorithm that can run within the Pyodide environment.

Examples Using trust-constr

Since the trust-constr algorithm was extracted from the scipy.optimize library, it uses the same interface as scipy.optimize.minimize. The main difference is that everything is imported from trust_constr rather than from scipy.optimize. The other difference is that the only available optimization method is 'trust-constr', so no method argument is needed. The examples below show how to use trust-constr with a variety of constraint types.

import numpy as np
from trust_constr import minimize, NonlinearConstraint, LinearConstraint, Bounds, check_grad

Example 1: Nonlinear Inequality Constraint with Variable Bounds

Example 15.1 from [1]

Solve:

    minimize  0.5*(x0 - 2)^2 + 0.5*(x1 - 0.5)^2

Subject to:

    1/(x0 + 1) - x1 >= 0.25
    x0 >= 0,  x1 >= 0

Solution:

    x ≈ [1.953, 0.089]

First solve without defining gradient (finite difference gradient will be used):

def objective(x):
    return 0.5*(x[0]-2)**2+0.5*(x[1]-0.5)**2

def ineq_constraint(x):
    return 1/(x[0]+1)-x[1]-0.25

# Use np.inf or -np.inf to define a one-sided constraint.
# If there is more than one constraint, pass a list
# containing all of the constraints.
constraints = NonlinearConstraint(ineq_constraint, 0, np.inf)

# set bounds on the variables
# only a lower bound is needed so the upper bound for both variables is set to np.inf
bounds = Bounds([0,0], [np.inf, np.inf])

# define starting point for optimization
x0 = np.array([5.0, 1.0])

res = minimize(objective, x0, bounds=bounds, constraints=constraints)

print("Solution =", res.x)
print(f"Obtained using {res.nfev} objective function evaluations.")
Solution = [1.95282327 0.08865882]
Obtained using 42 objective function evaluations.

Now define the gradient for objective and constraint and check gradients:

def objective_gradient(x):
    return np.array([(x[0]-2), (x[1]-0.5)])

def ineq_gradient(x):
    return np.array([-1/((x[0]+1)**2), -1])

# check analytical gradients against finite difference gradient
# an incorrect analytical gradient is a common cause for lack of convergence to a true minimum
for x in np.random.uniform(low=[0,0], high=[10,10], size=(5,2)):
    print("objective difference: ", check_grad(objective, objective_gradient, x))
    print("constraint difference:", check_grad(ineq_constraint, ineq_gradient, x))
objective difference:  7.24810320719611e-08
constraint difference: 2.1805555505335916e-08
objective difference:  1.5409355031965243e-08
constraint difference: 1.8387489794657874e-10
objective difference:  8.16340974645582e-08
constraint difference: 2.2211865402521624e-08
objective difference:  1.51975085661403e-07
constraint difference: 5.070987015715067e-10
objective difference:  1.7113557964841567e-07
constraint difference: 4.981334539820581e-08
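Under the hood, check_grad compares an analytical gradient against a finite-difference approximation and returns the norm of the difference. A simplified NumPy-only sketch of the idea (the fd_grad helper below is illustrative, not part of the library):

```python
import numpy as np

def fd_grad(f, x, eps=1.49e-8):
    # forward-difference approximation of the gradient of f at x
    fx = f(x)
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        step = np.zeros_like(x, dtype=float)
        step[i] = eps
        grad[i] = (f(x + step) - fx) / eps
    return grad

def objective(x):
    return 0.5*(x[0]-2)**2 + 0.5*(x[1]-0.5)**2

def objective_gradient(x):
    return np.array([x[0]-2, x[1]-0.5])

x = np.array([5.0, 1.0])
diff = np.linalg.norm(fd_grad(objective, x) - objective_gradient(x))
print(diff)  # small, on the order of 1e-7 or less
```

A large value here (say, on the order of the gradient itself) would indicate a mistake in the analytical derivative.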

Finally, minimize using the gradient functions that were just tested:

constraints = NonlinearConstraint(ineq_constraint, 0, np.inf, jac=ineq_gradient)

res = minimize(objective, x0, jac=objective_gradient, bounds=bounds, constraints=constraints)

print("Solution =", res.x)
print(f"Obtained using {res.nfev} objective function evaluations.")
Solution = [1.95282328 0.08865881]
Obtained using 14 objective function evaluations.
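The solver can also use second-derivative information. As with scipy.optimize.minimize, an analytic Hessian can be supplied through the hess argument; a hedged sketch for the quadratic objective above, whose Hessian is constant (the call is commented and assumes the same setup as the example):

```python
import numpy as np

def objective_hessian(x):
    # the objective is quadratic, so its Hessian is the constant identity matrix
    return np.eye(2)

# Hedged sketch of the call (same objective, x0, bounds, and constraints as above):
# res = minimize(objective, x0, jac=objective_gradient, hess=objective_hessian,
#                bounds=bounds, constraints=constraints)
```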

Example 2: Nonlinear Equality Constraint

Example 15.2 from [1]

Solve:

    minimize  x0^2 + x1^2

Subject to:

    (x0 - 1)^3 - x1^2 = 0

Solution:

    x = [1, 0]

objective2 = lambda x: x[0]**2 + x[1]**2
objective2_gradient = lambda x: np.array([2*x[0], 2*x[1]])

eq_constraint = lambda x: (x[0]-1)**3 - x[1]**2
eq_gradient = lambda x: np.array([3*(x[0]-1)**2, -2*x[1]]) 

# Make the upper and lower bound both zero to define an equality constraint
constraints = NonlinearConstraint(eq_constraint, 0, 0, jac=eq_gradient) 

x0 = np.array([5, 2])

res = minimize(objective2, x0, jac=objective2_gradient, constraints=constraints)

print("Solution =", res.x)
print(f"Obtained using {res.nfev} objective function evaluations.")
Solution = [9.99966899e-01 3.36074169e-09]
Obtained using 181 objective function evaluations.
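The known optimum is x* = (1, 0); a quick NumPy check confirms that the reported solution satisfies the equality constraint to high accuracy:

```python
import numpy as np

x = np.array([9.99966899e-01, 3.36074169e-09])  # solution reported above
residual = (x[0] - 1)**3 - x[1]**2
print(abs(residual))  # effectively zero
```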

Example 3: Linear Constraint

Example problem from [2]

Solve:

    minimize  100*(x1 - x0^2)^2 + (1 - x0)^2

Subject to:

    x0 + 2*x1 <= 1

Solution:

    x ≈ [0.5022, 0.2489]

objective3 = lambda x: 100.0*(x[1] - x[0]**2)**2.0 + (1 - x[0])**2

objective3_gradient = lambda x: np.array([-400*(x[1]-x[0]**2)*x[0]-2*(1-x[0]),
                                          200*(x[1]-x[0]**2)])

# define the linear constraint
A = np.array([[1,2]])
constraints = LinearConstraint(A, [-np.inf], [1])

x0 = np.array([-1, 2])

res = minimize(objective3, x0, jac=objective3_gradient, constraints=constraints)

print("Solution =", res.x)
print(f"Obtained using {res.nfev} objective function evaluations.")
Solution = [0.50220246 0.24889838]
Obtained using 45 objective function evaluations.
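LinearConstraint(A, lb, ub) encodes the componentwise condition lb <= A @ x <= ub, so the single row above enforces x0 + 2*x1 <= 1. Evaluating it at the reported solution shows the constraint is active at the optimum:

```python
import numpy as np

A = np.array([[1, 2]])
x = np.array([0.50220246, 0.24889838])  # solution reported above
print((A @ x)[0])  # ~1.0: the inequality holds with equality at the optimum
```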

Example 4: Unconstrained Optimization

Example problem from [3]

Solve:

    minimize  sum over i of 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2

Solution for N = 3:

    x = [1, 1, 1]

def rosenbrock_function(x):
    result = 0

    for i in range(len(x) - 1):
        result += 100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2

    return result

x0 = np.array([0.1, -0.5, -5.0])

res = minimize(rosenbrock_function, x0)

print("Solution =", res.x)
print(f"Obtained using {res.nfev} objective function evaluations.")
Solution = [0.99999729 0.99999458 0.99998915]
Obtained using 224 objective function evaluations.
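The Python loop above can equivalently be written in vectorized NumPy form (an illustrative alternative, not part of the library):

```python
import numpy as np

def rosenbrock_vec(x):
    x = np.asarray(x, dtype=float)
    # sum of 100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2 over consecutive pairs
    return np.sum(100.0*(x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

print(rosenbrock_vec([1.0, 1.0, 1.0]))  # 0.0 at the global minimum
```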

References

[1] Nocedal, Jorge, and Stephen J. Wright. Numerical Optimization. 2nd ed. Springer Series in Operations Research. New York: Springer, 2006.

[2] https://www.mathworks.com/help/optim/ug/fmincon.html

[3] https://en.wikipedia.org/wiki/Rosenbrock_function
