statsmodels.tools.numdiff.approx_fprime_cs

statsmodels.tools.numdiff.approx_fprime_cs(x, f, epsilon=None, args=(), kwargs={}) [source]

Calculate the gradient or Jacobian with a complex-step derivative approximation.

Parameters:

x : array

Parameters at which the derivative is evaluated.

f : function

The function f(*((x,)+args), **kwargs), returning either a single value or a 1d array.

epsilon : float, optional

Step size. If None, the optimal step size EPS*x is used. See Notes.

args : tuple

Tuple of additional arguments for function f.

kwargs : dict

Dictionary of additional keyword arguments for function f.

Returns:

partials : ndarray

Array of partial derivatives (gradient or Jacobian).
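
A minimal usage sketch is given below. The import path and call signature follow this page; the test function and its analytic gradient are illustrative choices, not part of the original documentation. Note that f must accept complex-valued input for the complex-step method to apply.

import numpy as np
from statsmodels.tools.numdiff import approx_fprime_cs

# Example function: f(x) = sum(x**3); its analytic gradient is 3*x**2.
def f(x):
    return np.sum(x ** 3)

x = np.array([1.0, 2.0, 3.0])
grad = approx_fprime_cs(x, f)   # complex-step gradient
print(grad)                     # approximately [ 3. 12. 27.]
print(3 * x ** 2)               # analytic gradient for comparison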

Notes

The complex-step derivative has truncation error O(epsilon**2), so the truncation error can be made negligible by choosing epsilon very small. Unlike finite differences, the complex-step derivative also avoids round-off error at small epsilon, because no subtraction is involved.
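
To illustrate the point in the Notes (this is a rough sketch of the technique, not the library's implementation), the complex-step approximation of a scalar derivative is Im(f(x + i*epsilon)) / epsilon. It involves no subtraction, so epsilon can be taken extremely small, whereas a forward difference with the same epsilon collapses to zero from round-off. The sample function and step size below are illustrative assumptions.

import numpy as np

def f(x):
    return np.exp(x) / np.sqrt(x)          # sample analytic function

def complex_step(f, x, eps=1e-20):
    # Im(f(x + i*eps)) / eps -- no subtraction, so eps can be tiny.
    return np.imag(f(x + 1j * eps)) / eps

def forward_diff(f, x, eps=1e-20):
    # (f(x + eps) - f(x)) / eps -- subtraction causes round-off for tiny eps.
    return (f(x + eps) - f(x)) / eps

x = 1.5
exact = np.exp(x) * (1 / np.sqrt(x) - 0.5 * x ** -1.5)  # analytic derivative
print(complex_step(f, x) - exact)   # essentially zero
print(forward_diff(f, x) - exact)   # large error: x + eps == x in doubles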

© 2009–2012 Statsmodels Developers
© 2006–2008 Scipy Developers
© 2006 Jonathan E. Taylor
Licensed under the 3-clause BSD License.
http://www.statsmodels.org/stable/generated/statsmodels.tools.numdiff.approx_fprime_cs.html
