symnum

Name: symnum
Version: 0.2.0
Summary: Symbolically construct NumPy functions and their derivatives.
Upload time: 2023-08-16 14:47:20
Requires Python: >=3.9
Keywords: sympy, numpy, symbolic, differentiation
<div style="text-align: center;" align="center">

<img src="https://raw.githubusercontent.com/matt-graham/symnum/main/images/logomark-dark-background.svg" alt="SymNum logo" width="120"/>

<h1>SymNum</h1>

<a href="https://badge.fury.io/py/symnum">
  <img src="https://badge.fury.io/py/symnum.svg" alt="PyPI version"/>
</a>
<a href="https://github.com/matt-graham/symnum/actions/workflows/tests.yml">
  <img src="https://github.com/matt-graham/symnum/actions/workflows/tests.yml/badge.svg" alt="Test status" />
</a>
<a href="https://matt-graham.github.io/symnum">
  <img src="https://github.com/matt-graham/symnum/actions/workflows/docs.yml/badge.svg" alt="Documentation status" />
</a>
</div>


## What is SymNum?

SymNum is a Python package that acts as a bridge between
[NumPy](https://numpy.org/) and [SymPy](https://www.sympy.org/), providing a
NumPy-like interface that can be used to symbolically define functions which
take arrays as arguments and return arrays or scalars as values. A set of
[Autograd](https://github.com/HIPS/autograd)-style functional differential
operators is also provided to construct derivatives of symbolic functions,
with the option to generate NumPy code to numerically evaluate these derivative
functions.

## Why use SymNum instead of Autograd or JAX?

SymNum is intended for use in generating the derivatives of 'simple' functions
which **compose a relatively small number of operations** and act on **small
array inputs**. By reducing interpreter overheads it can produce code which is
cheaper to evaluate than the corresponding
[Autograd](https://github.com/HIPS/autograd) or
[JAX](https://github.com/google/jax) functions (including those using [JIT
compilation](https://jax.readthedocs.io/en/latest/notebooks/quickstart.html#Using-jit-to-speed-up-functions))
in such cases, and which can be serialised with the built-in Python `pickle`
module, allowing its use, for example, in libraries that rely on
`multiprocessing` to parallelise across multiple processes.
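
As a minimal sketch of the serialisation point, assuming the generated
function pickles as described and reusing the `numpify_func` helper shown in
the example further below:

```Python
import pickle

import numpy as np
import symnum.numpy as snp
from symnum import numpify_func

# A toy scalar-valued function defined via the symnum.numpy interface.
def simple_func(x):
    return snp.cos(x[0]) + snp.sin(x[1])

# Generate a plain NumPy function for length-2 array arguments (the shape is
# given as a tuple, matching x.shape in the example below) and round-trip it
# through pickle, as might be needed when parallelising with multiprocessing.
simple_func_np = numpify_func(simple_func, (2,))
restored_func_np = pickle.loads(pickle.dumps(simple_func_np))
print(restored_func_np(np.array([0.2, 1.1])))
```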

The original motivating use case for SymNum was to automatically construct the
derivatives of the sorts of functions of low-dimensional inputs which are
commonly used as toy examples to demonstrate inference and optimisation
algorithms. In these cases manually deriving and implementing derivatives is
generally possible, but it can still be laborious and error-prone, and distract
from the purpose of giving a simple showcase of an algorithm. On the other
hand, the derivative functions produced by Autograd and JAX in such cases are
often much slower than manual implementations. SymNum tries to fill this gap by
providing the flexibility and ease of use that come with automatic
differentiation while still being efficient for small toy examples.


## Doesn't SymPy already have array support and allow export of NumPy functions?

Yes: SymNum is mainly a convenience wrapper around functionality already
provided by SymPy, intended to make it easier to use for those already familiar
with NumPy and Autograd / JAX. Specifically, SymPy has several inbuilt
array-like classes, which can be broadly split into the [array
types](https://docs.sympy.org/latest/modules/tensor/array.html) defined in
`sympy.tensor.array` and the [matrix
types](https://docs.sympy.org/latest/modules/matrices/matrices.html) defined
in `sympy.matrices`.
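
For reference, the two families of SymPy container types look roughly like
this (standard SymPy API, shown here only for comparison):

```Python
import sympy as sym

A = sym.Array([[1, 2], [3, 4]])    # N-dimensional immutable array type
M = sym.Matrix([[1, 2], [3, 4]])   # matrix type, always two-dimensional
v = sym.Matrix([1, 2])             # even 'vectors' are 2D column matrices

print(A.shape, M.shape, v.shape)   # (2, 2) (2, 2) (2, 1)
```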

Each of the inbuilt array and matrix classes supports some of the functionality
of NumPy's core `ndarray` class, but both have issues which mean they do not
provide an easy drop-in replacement: the matrix classes are limited to two
dimensions, and neither the array nor the matrix classes support the full
broadcasting and operator overloading semantics of NumPy arrays. The
`SymbolicArray` class in `symnum.array` aims to provide a more `ndarray`-like
interface, supporting broadcasting of elementwise binary arithmetic operations
like `*`, `/`, `+` and `-`, elementwise NumPy ufunc-like mathematical functions
such as `numpy.log` via the `symnum.numpy` module, simple array contractions
over potentially multiple axes with the `sum` and `prod` methods, and matrix
multiplication with the `@` operator.
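
A minimal sketch of that interface, assuming `named_array` accepts a tuple
shape and that `snp.log` is among the ufunc-like functions exposed by
`symnum.numpy` (both assumptions based on the description above rather than
the example below):

```Python
import symnum.numpy as snp
from symnum import named_array

# Symbolic array inputs; the tuple shape for `a` is assumed to be accepted.
a = named_array(name="a", shape=(2, 2))
b = named_array(name="b", shape=2)

c = a * b           # elementwise multiply, b broadcast over the rows of a
d = snp.log(a) + 1  # elementwise ufunc-like function with scalar broadcasting
s = c.sum()         # contraction over all axes with the sum method
m = a @ b           # matrix-vector multiplication with the @ operator
```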

Similarly, SymPy has extensive built-in [code generation](https://docs.sympy.org/latest/modules/codegen.html)
features, including the
[`lambdify`](https://docs.sympy.org/latest/modules/utilities/lambdify.html)
function, which supports generation of functions that operate on NumPy arrays.
It can be non-trivial, however, to use these features to generate code which
performs indexing operations on array inputs, or to construct higher-order
functions which return [closures](https://en.wikipedia.org/wiki/Closure_(computer_programming)).
SymNum builds on top of SymPy's code generation functionality to allow simpler
generation of NumPy functions using such features.
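
For comparison, the plain SymPy route via `lambdify` looks like this (standard
SymPy API, working with individual scalar symbols rather than array arguments):

```Python
import sympy as sym

# Build an expression from scalar symbols and generate a NumPy-backed function.
x0, x1 = sym.symbols("x0 x1")
expr = sym.cos(-x1**2 + 3 * x0)
f = sym.lambdify((x0, x1), expr, modules="numpy")

print(f(0.2, 1.1))  # evaluates numerically; also accepts NumPy arrays elementwise
```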


## Example

```Python
import numpy as np
import symnum.numpy as snp
from symnum import named_array, numpify_func, jacobian

# Define a function using the symnum.numpy interface.
def func(x):
    return (snp.array([[1., -0.5], [-2., 3.]]) @ 
            snp.array([snp.cos(-x[1]**2 + 3 * x[0]), snp.sin(x[0] - 1)]))

# Create a named symbolic array to act as input and evaluate func symbolically.
x = named_array(name='x', shape=2)
y = func(x)

# Alternatively we can symbolically 'trace' func and use this to generate a
# NumPy function which accepts ndarray arguments. To allow the tracing we
# need to manually specify the shapes of the arguments to the function.
x_np = np.array([0.2, 1.1])
func_np = numpify_func(func, x.shape)
y_np = func_np(x_np)

# We can also use a similar approach to generate a NumPy function to evaluate
# the Jacobian of func on ndarray arguments. The numpified function func_np 
# stores the symbolic function used to generate it and details of the argument
# shapes and so we can pass it as a sole argument to jacobian without
# specifying the argument shapes.
jacob_func_np = jacobian(func_np)
dy_dx_np = jacob_func_np(x_np)
```

See also the [demo Jupyter notebook](https://github.com/matt-graham/symnum/blob/main/Demo.ipynb).
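
For scalar-valued functions the same pattern extends to the other
Autograd-style operators mentioned above; the sketch below assumes `gradient`
and `hessian` operators are exported alongside `jacobian` (these names are an
assumption, not shown in the example above, so check the API documentation):

```Python
import numpy as np
import symnum.numpy as snp
from symnum import numpify_func, gradient, hessian  # gradient / hessian names assumed

# A scalar-valued toy function of a length-2 array argument.
def neg_log_dens(x):
    return snp.cos(x[0]) * snp.sin(x[1]) + (x * x).sum() / 2

neg_log_dens_np = numpify_func(neg_log_dens, (2,))
grad_np = gradient(neg_log_dens_np)  # NumPy function evaluating the gradient
hess_np = hessian(neg_log_dens_np)   # NumPy function evaluating the Hessian

x_np = np.array([0.2, 1.1])
print(neg_log_dens_np(x_np), grad_np(x_np), hess_np(x_np))
```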



## Current limitations

SymNum only supports a small subset of the NumPy API at the moment. A
non-exhaustive list of things that don't currently work:

  * Indexed / sliced assignment to arrays, e.g. `a[i, j] = x` and `a[:, j] = y`.
  * Matrix multiplication with `@` of arrays with more than two dimensions.
  * Linear algebra operations in `numpy.linalg` and FFT functions in `numpy.fft`.
  * All `scipy` functions, such as the special functions in `scipy.special`.
  * Similar to the limitations on using [Python control flow with the JIT
    transformation in JAX](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#%F0%9F%94%AA-Control-Flow),
    symbolic tracing of functions with SymNum requires that control flow does
    not depend on the values of array arguments (see the sketch after this
    list).
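
As a rough illustration of the control flow constraint (using only the pieces
shown in the example above):

```Python
import numpy as np
import symnum.numpy as snp
from symnum import numpify_func

# Fine: no control flow depends on the values in x, so symbolic tracing works.
def smooth(x):
    return (x * x).sum() + snp.cos(x[0])

smooth_np = numpify_func(smooth, (2,))
print(smooth_np(np.array([0.3, -1.2])))

# Not fine: the branch depends on the value of a (symbolic) array element, so
# tracing this function would be expected to fail.
def clip_first(x):
    if x[0] > 0:
        return x[0]
    return -x[0]
```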

Some of these are not fundamental limitations, and SymNum's coverage will
improve over time (pull requests are very welcome!). However, as the focus is
on automatic generation of derivatives of simple functions of smallish arrays,
if your use case relies on more complex NumPy features you are likely to find
Autograd or JAX to be better bets.

            
