
Solves automatic numerical differentiation problems in one or more variables.

Project description

Suite of tools to solve automatic numerical differentiation problems in one or more variables. All of these methods also produce error estimates on the result. A PDF file explaining the theory behind these tools is also provided.

To test whether the toolbox is working, paste the following into an interactive Python session:

import numdifftools as nd
nd.test(coverage=True)

Derivative

A flexible tool for computing derivatives of order 1 through 4 of any scalar function. Finite differences are used in an adaptive manner, coupled with Romberg extrapolation, to provide a maximally accurate result. Many options are configurable: the user can change the order of the method or of the extrapolation, and specify whether central, forward or backward differences are used.
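The core idea (a low-order finite difference refined by extrapolation) can be sketched in plain numpy. This is an illustrative implementation of central differences with Richardson/Romberg extrapolation, not the package's actual code; the function name `richardson_derivative` and its parameters are made up for this sketch.

```python
import numpy as np

def central_diff(f, x, h):
    """First derivative of f at x via the central difference (O(h**2) error)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson_derivative(f, x, h=0.1, levels=4):
    """Estimate f'(x) by Richardson (Romberg-style) extrapolation of
    central differences, halving the step at each level."""
    # T[k][0] holds the plain central difference with step h / 2**k.
    T = [[central_diff(f, x, h / 2**k)] for k in range(levels)]
    for j in range(1, levels):
        for k in range(j, levels):
            # Each new column cancels the next even-order error term
            # of the central-difference expansion (hence the 4**j).
            T[k].append(T[k][j - 1] + (T[k][j - 1] - T[k - 1][j - 1]) / (4**j - 1))
    return T[-1][-1]

d = richardson_derivative(np.exp, 1.0)   # should be close to e
```

Because the central-difference error expands in even powers of h, each extrapolation column divides by 4**j - 1; a few levels already push the truncation error near machine precision.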

Gradient

Computes the gradient vector of a scalar function of one or more variables at any location.

Jacobian

Computes the Jacobian matrix of a vector (or array) valued function of one or more variables.

Hessian

Computes the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.
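A minimal sketch of what a finite-difference Hessian looks like, using the standard four-point stencil for mixed second partials. This is an assumption-laden illustration in plain numpy (fixed step, no adaptivity or extrapolation), not numdifftools' implementation; the helper name `hessian_fd` is invented here.

```python
import numpy as np

def hessian_fd(f, x, h=1e-4):
    """Dense Hessian of a scalar function f at x via central second
    differences with a fixed step h (O(h**2) truncation error)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        for j in range(n):
            ej = np.zeros(n); ej[j] = h
            # Four-point stencil for d2f / (dx_i dx_j).
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

# For f(x) = x0**2 + 3*x0*x1 the exact Hessian is [[2, 3], [3, 0]].
H = hessian_fd(lambda x: x[0]**2 + 3*x[0]*x[1], [1.0, 2.0])
```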

Hessdiag

Computes only the diagonal elements of the Hessian matrix, i.e. the pure second-order partial derivatives of a scalar function of one or more variables.
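When only the pure second derivatives are needed, each one reduces to a three-point second-difference stencil along a single axis, which is much cheaper than forming the full Hessian. A hedged numpy sketch (the name `hessdiag_fd` and the fixed-step scheme are this example's own, not the package's):

```python
import numpy as np

def hessdiag_fd(f, x, h=1e-4):
    """Diagonal of the Hessian of a scalar function f at x, using the
    three-point stencil (f(x+h*e_i) - 2 f(x) + f(x-h*e_i)) / h**2
    along one coordinate axis at a time."""
    x = np.asarray(x, dtype=float)
    d = np.empty(x.size)
    fx = f(x)  # reuse the centre evaluation for every axis
    for i in range(x.size):
        ei = np.zeros(x.size); ei[i] = h
        d[i] = (f(x + ei) - 2.0 * fx + f(x - ei)) / (h * h)
    return d

# For f(x) = sum(x**3) the diagonal is 6*x, so [6, 12, 18] at [1, 2, 3].
d = hessdiag_fd(lambda x: np.sum(x**3), [1.0, 2.0, 3.0])
```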

Examples

Compute the 1st and 2nd derivative of exp(x) at x == 1:

>>> import numpy as np
>>> import numdifftools as nd
>>> fd = nd.Derivative(np.exp)        # 1st derivative
>>> fdd = nd.Derivative(np.exp, n=2)  # 2nd derivative
>>> fd(1)
array([ 2.71828183])

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nd.Jacobian(fun)
>>> np.abs(Jfun([1,2,0.75])) < 1e-14 # should be numerically zero
array([[ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True],
       [ True,  True,  True]], dtype=bool)

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nd.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])

See also

scipy.misc.derivative

Download files

Source Distribution

Numdifftools-0.4.0.zip (170.4 kB)

Built Distribution

Numdifftools-0.4.0.win32.exe (230.5 kB)

File hashes

Hashes for Numdifftools-0.4.0.zip

SHA256 2263a7a5a261cf1b4036f173af29637e511650e1546d38814a5973c806cd0b87
MD5 944981e22d56ff299f73124ea1e62edc
BLAKE2b-256 218f1c1a867e6ae6532875eccaa0eb2f427ff2ffe9dfd880d83a3fae94d1d5dd

Hashes for Numdifftools-0.4.0.win32.exe

SHA256 e12b59187c1f06e8b4c9cad77a49df7bf8d31eec3aaff6eba7eac2a2ba242688
MD5 06166e83cfdf7fcf6f4fbb82819d3f84
BLAKE2b-256 07e1a10be415f49cff93051b9e93df8c2c6a157159ee053702311902aecdefe2
