algopy


Name: algopy
Version: 0.7.2
Home page: https://packages.python.org/algopy
Summary: ALGOPY: Taylor Arithmetic Computation and Algorithmic Differentiation
Upload time: 2024-07-07 08:52:00
Documentation: https://pythonhosted.org/algopy/
Author: Sebastian F. Walter
Requires Python: >=3.6
License: BSD
Keywords: algorithmic differentiation, computational differentiation, automatic differentiation, forward mode, reverse mode, Taylor arithmetic
Requirements: no requirements were recorded
AlgoPy, a library for Automatic Differentiation (AD) in Python
---------------------------------------------------------------

Description:
    AlgoPy allows you to differentiate functions implemented as computer programs
    by using Algorithmic Differentiation (AD) techniques in the forward and
    reverse mode.

    The forward mode propagates univariate Taylor polynomials of arbitrary order.
    Hence it is also possible to use AlgoPy to evaluate higher-order derivative tensors.
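
    For instance, propagating a degree-4 Taylor polynomial through a scalar
    function yields its higher-order Taylor coefficients. A minimal sketch,
    assuming the UTPM data layout with D Taylor coefficients and P directions::

        import numpy
        from algopy import UTPM, sin

        D, P = 4, 1                      # 4 Taylor coefficients, 1 direction
        x = UTPM(numpy.zeros((D, P)))
        x.data[0, 0] = 1.0               # expansion point x0 = 1
        x.data[1, 0] = 1.0               # seed direction dx/dt = 1

        y = sin(x)                       # propagate x0 + t through sin
        # Taylor coefficients of sin(1 + t): sin(1), cos(1), -sin(1)/2, -cos(1)/6
        print(y.data[:, 0])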

    The reverse mode is also known as backpropagation and can be found in similar form in tools like PyTorch.

    A speciality of AlgoPy is the ability to differentiate functions that contain
    matrix operations such as +, -, *, /, dot, solve, qr, eigh, and cholesky.


Rationale:
    Many programs for scientific computing make use of numerical linear algebra.
    The de facto standard for array manipulations in Python is NumPy.
    AlgoPy allows you to write code that can be evaluated either by NumPy or by
    AlgoPy, with little or no modification.

    Note that this does not mean that any code you wrote can be differentiated with AlgoPy,
    but rather that you can write code that can be evaluated with or without AlgoPy.
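
    As a rough sketch of this idea (assuming that AlgoPy's array functions such
    as `algopy.exp` and `algopy.dot` fall back to NumPy when given plain
    arrays), the same function can be evaluated directly or differentiated::

        import numpy
        import algopy

        def g(x):
            # squared Euclidean norm of the elementwise exponential
            y = algopy.exp(x)
            return algopy.dot(y, y)

        x0 = numpy.array([0.1, 0.2, 0.3])

        print(g(x0))                               # evaluated with plain NumPy

        x = algopy.UTPM.init_jacobian(x0)          # same code, AlgoPy inputs ...
        print(algopy.UTPM.extract_jacobian(g(x)))  # ... yields the gradient of g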


Documentation:
    Available at http://packages.python.org/algopy/

    For more documentation have a look at:
        1) the talks in the ./documentation folder
        2) the examples in the ./documentation/examples folder
        3) the Sphinx documentation in ./documentation/sphinx (run `make` to build it)


Example:
    Compute the Jacobian J = dy/dx of the function f(x)::

        import numpy
        from algopy import UTPM, qr, solve, dot, eigh

        def f(x):
            # smallest eigenvalue of (R^T R)^{-1}, where x = Q R is a QR decomposition
            N, M = x.shape
            Q, R = qr(x)
            Id = numpy.eye(M)
            Rinv = solve(R, Id)
            C = dot(Rinv, Rinv.T)
            l, U = eigh(C)
            return l[0]

        # seed a random 50 x 10 input and propagate it through f
        x = UTPM.init_jacobian(numpy.random.random((50, 10)))
        y = f(x)
        J = UTPM.extract_jacobian(y)

        print('Jacobian dy/dx =', J)

Installation:

    see http://packages.python.org/algopy/


Features:

    Univariate Taylor Propagation:

        * Univariate Taylor Propagation on Matrices (UTPM)
          Implementation in: `algopy.utpm`
        * Exact Interpolation of Higher-Order Derivative Tensors
          (Hessians, etc.); see the sketch below.
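
        A sketch of the higher-order functionality, assuming the
        `UTPM.init_hessian`/`UTPM.extract_hessian` helpers described in the
        documentation::

            import numpy
            from algopy import UTPM

            def g(x):
                return x[0]*x[1]*x[2] + 7*x[0]

            # seed the input for Hessian computation and propagate it through g
            x = UTPM.init_hessian(numpy.array([3., 5., 7.]))
            y = g(x)

            # 3 = number of independent variables
            H = UTPM.extract_hessian(3, y)
            print(H)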

    Reverse Mode:

        ALGOPY also features functionality for conveniently differentiating a given
        algorithm. For that, the sequence of operations is recorded by tracing the
        evaluation of the algorithm. Implementation in: `./algopy/tracer.py`
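
        A minimal sketch of the tracing workflow, assuming the `CGraph` and
        `Function` interface shown in the documentation::

            import numpy
            import algopy

            def g(x):
                return x[0]*x[1]*x[2] + 7*x[0]

            # record the computational graph by tracing one evaluation
            cg = algopy.CGraph()
            x = algopy.Function(numpy.array([1., 2., 3.]))
            y = g(x)
            cg.trace_off()
            cg.independentFunctionList = [x]
            cg.dependentFunctionList = [y]

            # reuse the recorded graph to evaluate the gradient at a new point
            print(cg.gradient(numpy.array([3., 5., 7.])))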

Testing:

    Uses NumPy's testing facilities. Simply run::

        $ python -c "import algopy; algopy.test()"


Alternatives:

    Nowadays there are many alternatives, such as `PYTORCH`_, which provide more
    efficient backpropagation on CPUs and GPUs.

    For AD in Python you can also have a look at

        * `PYADOLC`_, a Python wrapper for ADOL-C (C++)
        * `PYCPPAD`_, a Python wrapper for CppAD (C++)

    However, their support for differentiating Numerical Linear Algebra (NLA)
    functions is very limited.

    .. _PYADOLC: http://www.github.com/b45ch1/pyadolc
    .. _PYCPPAD: http://www.github.com/b45ch1/pycppad
    .. _PYTORCH: https://pytorch.org/

Email:

    sebastian.walter@gmail.com

How to cite AlgoPy::

    @article{Walter2011,
    title = "Algorithmic differentiation in Python with AlgoPy",
    journal = "Journal of Computational Science",
    volume = "",
    number = "0",
    pages = " - ",
    year = "2011",
    note = "",
    issn = "1877-7503",
    doi = "10.1016/j.jocs.2011.10.007",
    url = "http://www.sciencedirect.com/science/article/pii/S1877750311001013",
    author = "Sebastian F. Walter and Lutz Lehmann",
    keywords = "Automatic differentiation",
    keywords = "Cholesky decomposition",
    keywords = "Hierarchical approach",
    keywords = "Higher-order derivatives",
    keywords = "Numerical linear algebra",
    keywords = "NumPy",
    keywords = "Taylor arithmetic"
    }


-------------------------------------------------------------------------------

Licence:
    BSD style using http://www.opensource.org/licenses/bsd-license.php template
    as it was on 2009-01-24 with the following substitutions:

    * <YEAR> = 2008-2009
    * <OWNER> = Sebastian F. Walter, sebastian.walter@gmail.com
    * <ORGANIZATION> = contributors' organizations
    * In addition, "Neither the name of the contributors' organizations" was changed to "Neither the names of the contributors' organizations"


Copyright (c) 2008-2009, Sebastian F. Walter
All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

    * Redistributions of source code must retain the above copyright notice,
      this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright notice,
      this list of conditions and the following disclaimer in the documentation
      and/or other materials provided with the distribution.
    * Neither the names of the contributors' organizations nor the names of
      its contributors may be used to endorse or promote products derived from
      this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

            
