Package numdifftools :: Module core :: Class Jacobian
=====================================================
object --+
         |
Common_diff_par --+
                  |
                 Jacobian
Estimate the Jacobian matrix, with an error estimate.
Input arguments
===============
fun : function to differentiate.
**kwds
------
derOrder : Derivative order; always 1 for the Jacobian.
metOrder : Integer from 1 to 4 defining the order of the basic method used.
           For 'central' methods it must be from the set [2, 4].
           (Default 2)
method : Method of estimation. Valid options are:
         'central', 'forward' or 'backward'. (Default 'central')
numTerms : Number of Romberg terms used in the extrapolation.
Must be an integer from 0 to 3. (Default 2)
Note: 0 disables the Romberg step completely.
stepFix : If not None, this fixed value defines the maximum excursion
          from x0 that is used, and it disables the adaptive logic.
          This is considerably faster, but not necessarily as accurate
          as allowing the adaptive logic to run. (Default None)
stepMax : Maximum allowed excursion from x0 as a multiple of x0. (Default 100)
stepRatio : Ratio between sequential steps in the estimation
            of the derivative. (Default 2)
vectorized : True if your function is vectorized;
             False to loop over successive function calls (default).
Uses a semi-adaptive scheme to provide the best estimate of the
derivative by automatically choosing a differencing interval. It uses
finite difference approximations of various orders, coupled with a
generalized (multiple-term) Romberg extrapolation. This procedure also
yields the reported error estimate. See the document DERIVEST.pdf for a
fuller explanation of the algorithms behind these parameters.
Note on metOrder: higher-order methods are generally more accurate,
but may also suffer more from numerical problems. First-order
methods are usually not recommended.
Note on method: Central difference methods are usually the most accurate,
but sometimes the function can only be evaluated in the forward or
backward direction.
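As an illustration of the keyword arguments above, an estimator
restricted to forward differences with a fixed step might be
constructed as follows (a minimal sketch; fun is a placeholder for the
function to differentiate, and the step value is arbitrary):

>>> Jfun = Jacobian(fun, method='forward', metOrder=2, stepFix=1e-5)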
The Jacobian matrix is the matrix of all first-order partial derivatives
of a vector-valued function.
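In index notation, entry (i, j) of the Jacobian is the partial
derivative of the i-th component of fun with respect to the j-th
variable:

    jac[i, j] = d fun_i / d x_j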
Assumptions
-----------
fun : (vector valued)
analytical function to differentiate.
fun must be a function of the vector or array x0.
x0 : vector location at which to differentiate fun
If x0 is an N x M array, then fun is assumed to be
a function of N*M variables.
Examples
--------
#(nonlinear least squares)
>>> import numpy as np
>>> from numdifftools.core import Jacobian
>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = Jacobian(fun)
>>> Jfun([1,2,0.75]) # should be numerically zero
array([[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00],
[ 0.00000000e+00, 0.00000000e+00, -1.30229526e-17],
[ 0.00000000e+00, -2.12916532e-17, 6.35877095e-17],
[ 0.00000000e+00, 0.00000000e+00, 6.95367972e-19],
[ 0.00000000e+00, 0.00000000e+00, -2.13524915e-17],
[ 0.00000000e+00, -3.08563327e-16, 7.43577440e-16],
[ 0.00000000e+00, 1.16128292e-15, 1.71041646e-15],
[ 0.00000000e+00, 0.00000000e+00, -5.51592310e-16],
[ 0.00000000e+00, -4.51138245e-19, 1.90866225e-15],
[ -2.40861944e-19, -1.82530534e-15, -4.02819694e-15]])
>>> Jfun.error_estimate
array([[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15],
[ 2.22044605e-15, 2.22044605e-15, 6.17089138e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.74306254e-15],
[ 2.22044605e-15, 2.22044605e-15, 2.22044605e-15]])
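As an additional sanity check (not part of the original example set),
the estimate can be compared against a function whose analytical
Jacobian is known; fun2 below is hypothetical:

>>> fun2 = lambda x: np.array([x[0]**2, x[0]*x[1]])
>>> Jfun2 = Jacobian(fun2)
>>> # analytical Jacobian is [[2*x[0], 0], [x[1], x[0]]] = [[4, 0], [3, 2]]
>>> np.allclose(Jfun2([2., 3.]), [[4., 0.], [3., 2.]])
True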
See also
--------
Gradient,
Derivative,
Hessdiag,
Hessian
Method Summary
--------------
__call__(self, x00)
    Return the Jacobian matrix of a vector-valued function of n variables.
Method Details
--------------
jacobian(self, x00)
Return the Jacobian matrix of a vector-valued function of n variables.
Parameters
----------
x0 : vector
    location at which to differentiate fun.
    If x0 is an n x m array, then fun is assumed to be
    a function of n*m variables.
Member variable used
--------------------
fun : (vector valued) analytical function to differentiate.
fun must be a function of the vector or array x0.
Returns
-------
jac : array-like
first partial derivatives of fun. Assuming that x0
is a vector of length p and fun returns a vector
of length n, then jac will be an array of size (n,p)
err : vector
    error estimates corresponding to each partial
    derivative in jac.
See also
--------
Derivative,
Gradient,
Hessian,
Hessdiag
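A brief usage sketch (assuming, as the summary above suggests, that
__call__ delegates to this method, and that error estimates are exposed
via the error_estimate attribute shown in the Examples; fun and x0 are
placeholders):

>>> Jfun = Jacobian(fun)
>>> jac = Jfun(x0)             # equivalent to Jfun.jacobian(x0)
>>> err = Jfun.error_estimate  # error estimate for each entry of jac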