Numdifftools
============

|pkg_img| |tests_img| |tests2_img| |docs_img| |health_img| |coverage_img| |versions_img| |depsy_img|

Numdifftools is a suite of tools written in `Python <http://www.python.org/>`_
to solve automatic numerical differentiation problems in one or more variables.
Finite differences are used in an adaptive manner, coupled with a Richardson
extrapolation methodology, to provide a maximally accurate result.
The user can configure many options, such as the order of the method or of
the extrapolation, and can specify whether complex-step, central, forward
or backward differences are used.

The methods provided are:

- **Derivative**: Compute the derivatives of order 1 through 10 of any scalar function.

- **directionaldiff**: Compute the directional derivative of a function of n variables.

- **Gradient**: Compute the gradient vector of a scalar function of one or more variables.

- **Jacobian**: Compute the Jacobian matrix of a vector-valued function of one or more variables.

- **Hessian**: Compute the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.

- **Hessdiag**: Compute only the diagonal elements of the Hessian matrix.

All of these methods also produce error estimates on the result.

Numdifftools also provides an easy-to-use interface to derivatives calculated
with `AlgoPy <https://pythonhosted.org/algopy/>`_. AlgoPy stands for Algorithmic
Differentiation in Python.
The purpose of AlgoPy is the evaluation of higher-order derivatives in the
`forward` and `reverse` modes of Algorithmic Differentiation (AD) of functions
that are implemented as Python programs.

Getting Started
---------------

Visualize high-order derivatives of the tanh function:

    >>> import numpy as np
    >>> import numdifftools as nd
    >>> import matplotlib.pyplot as plt
    >>> x = np.linspace(-2, 2, 100)
    >>> for i in range(10):
    ...    df = nd.Derivative(np.tanh, n=i)
    ...    y = df(x)
    ...    h = plt.plot(x, y/np.abs(y).max())

    >>> plt.show()

.. image:: https://raw.githubusercontent.com/pbrod/numdifftools/master/examples/fun.png
    :target: https://github.com/pbrod/numdifftools/blob/master/examples/fun.py

Compute the 1st and 2nd derivative of exp(x) at x == 1::

    >>> fd = nd.Derivative(np.exp)        # 1st derivative
    >>> fdd = nd.Derivative(np.exp, n=2)  # 2nd derivative
    >>> np.allclose(fd(1), 2.7182818284590424)
    True
    >>> np.allclose(fdd(1), 2.7182818284590424)
    True

Nonlinear least squares::

    >>> xdata = np.reshape(np.arange(0, 1, 0.1), (-1, 1))
    >>> ydata = 1 + 2 * np.exp(0.75 * xdata)
    >>> fun = lambda c: (c[0] + c[1] * np.exp(c[2] * xdata) - ydata)**2
    >>> Jfun = nd.Jacobian(fun)
    >>> np.allclose(np.abs(Jfun([1, 2, 0.75])), 0)  # should be numerically zero
    True

Compute the gradient of sum(x**2)::

    >>> fun = lambda x: np.sum(x**2)
    >>> dfun = nd.Gradient(fun)
    >>> dfun([1, 2, 3])
    array([ 2.,  4.,  6.])

Compute the same with the easy-to-use interface to AlgoPy::

    >>> import numdifftools.nd_algopy as nda
    >>> import numpy as np
    >>> fd = nda.Derivative(np.exp)        # 1st derivative
    >>> fdd = nda.Derivative(np.exp, n=2)  # 2nd derivative
    >>> np.allclose(fd(1), 2.7182818284590424)
    True
    >>> np.allclose(fdd(1), 2.7182818284590424)
    True

Nonlinear least squares::

    >>> xdata = np.reshape(np.arange(0, 1, 0.1), (-1, 1))
    >>> ydata = 1 + 2 * np.exp(0.75 * xdata)
    >>> fun = lambda c: (c[0] + c[1] * np.exp(c[2] * xdata) - ydata)**2
    >>> Jfun = nda.Jacobian(fun, method='reverse')
    >>> np.allclose(np.abs(Jfun([1, 2, 0.75])), 0)  # should be numerically zero
    True

Compute the gradient of sum(x**2)::

    >>> fun = lambda x: np.sum(x**2)
    >>> dfun = nda.Gradient(fun)
    >>> dfun([1, 2, 3])
    array([ 2.,  4.,  6.])

See also
--------
scipy.misc.derivative

Documentation and code
======================

Numdifftools works on Python 2.7+ and Python 3.0+.

Official releases available at: http://pypi.python.org/pypi/numdifftools |pkg_img|

Official documentation available at: http://numdifftools.readthedocs.io/en/latest/ |docs_img|

Bleeding edge: https://github.com/pbrod/numdifftools.

Installation
============

If you have pip installed, then simply type::

    $ pip install numdifftools

to get the latest stable version. Using pip also has the advantage that all
requirements are installed automatically.

Unit tests
==========
To test that the toolbox is working, paste the following into an interactive
Python session::

    import numdifftools as nd
    nd.test(coverage=True, doctests=True)

Acknowledgement
===============
The `numdifftools package <http://pypi.python.org/pypi/numdifftools/>`_ for
`Python <https://www.python.org/>`_ was written by Per A. Brodtkorb
based on the adaptive numerical differentiation toolbox written in
`Matlab <http://www.mathworks.com>`_  by John D'Errico [DErrico2006]_.

As of version 0.9, numdifftools has been extended with some of the
functionality found in the ``statsmodels.tools.numdiff`` module written by
Josef Perktold [Perktold2014]_.

References
===========

.. [DErrico2006] D'Errico, J. R. (2006),
    Adaptive Robust Numerical Differentiation,
    http://www.mathworks.com/matlabcentral/fileexchange/13490-adaptive-robust-numerical-differentiation

.. [Perktold2014] Perktold, J. (2014), numdiff package,
    http://statsmodels.sourceforge.net/0.6.0/_modules/statsmodels/tools/numdiff.html

.. [Lantoine2010] Gregory Lantoine (2010),
    A methodology for robust optimization of low-thrust trajectories in
    multi-body environments, PhD thesis, Georgia Institute of Technology

.. [LantoineEtal2012] Gregory Lantoine, R.P. Russell, and T. Dargent (2012),
    Using multicomplex variables for automatic computation of high-order
    derivatives, ACM Transactions on Mathematical Software,
    Vol. 38, No. 3, Article 16, April 2012, 21 pages,
    http://doi.acm.org/10.1145/2168773.2168774

.. [Luna-ElizarrarasEtal2012] M.E. Luna-Elizarraras, M. Shapiro, D.C. Struppa,
    A. Vajiac (2012), CUBO A Mathematical Journal,
    Vol. 14, No. 2, (61-80), June 2012.

.. [Verheyleweghen2014] Adriaen Verheyleweghen (2014),
    Computation of higher-order derivatives using the multi-complex step method,
    Project report, NTNU

.. |pkg_img| image:: https://badge.fury.io/py/numdifftools.png
    :target: https://pypi.python.org/pypi/Numdifftools/

.. |tests_img| image:: https://travis-ci.org/pbrod/numdifftools.svg?branch=master
    :target: https://travis-ci.org/pbrod/numdifftools

.. |tests2_img| image:: https://ci.appveyor.com/api/projects/status/qeoegaocw41lkarv/branch/master?svg=true
    :target: https://ci.appveyor.com/project/pbrod/numdifftools

.. |docs_img| image:: https://readthedocs.org/projects/pip/badge/?version=latest
    :target: http://numdifftools.readthedocs.org/en/latest/

.. |health_img| image:: https://landscape.io/github/pbrod/numdifftools/master/landscape.svg?style=flat
    :target: https://landscape.io/github/pbrod/numdifftools/master
    :alt: Code Health

.. |coverage_img| image:: https://coveralls.io/repos/pbrod/numdifftools/badge.svg?branch=master
    :target: https://coveralls.io/github/pbrod/numdifftools?branch=master

.. |versions_img| image:: https://img.shields.io/pypi/pyversions/numdifftools.svg
    :target: https://github.com/pbrod/numdifftools

.. |depsy_img| image:: http://depsy.org/api/package/pypi/Numdifftools/badge.svg
    :target: http://depsy.org/package/python/Numdifftools
