<h1 align="center">sympytorch</h1>
Turn SymPy expressions into PyTorch Modules.
SymPy floats (optionally) become trainable parameters. SymPy symbols are inputs to the Module.
Optimise your symbolic expressions via gradient descent!
## Installation
```bash
pip install sympytorch
```
Requires Python 3.7+, PyTorch 1.6.0+, and SymPy 1.7.1+.
## Example
```python
import sympy, torch, sympytorch
x = sympy.symbols('x_name')
cosx = 1.0 * sympy.cos(x)
sinx = 2.0 * sympy.sin(x)
mod = sympytorch.SymPyModule(expressions=[cosx, sinx])
x_ = torch.rand(3)
out = mod(x_name=x_) # out has shape (3, 2)
assert torch.equal(out[:, 0], x_.cos())
assert torch.equal(out[:, 1], 2 * x_.sin())
assert out.requires_grad # from the two Parameters initialised as 1.0 and 2.0
assert {x.item() for x in mod.parameters()} == {1.0, 2.0}
```
## API
```python
sympytorch.SymPyModule(*, expressions, extra_funcs=None)
```
Where:
- `expressions` is a list of SymPy expressions.
- `extra_funcs` is a dictionary mapping custom `sympy.Function`s to their PyTorch implementations. Defaults to no extra functions.
Instances of `SymPyModule` can be called, passing the values of the symbols as in the above example.
`SymPyModule` has a method `.sympy()`, which returns the corresponding list of SymPy expressions. (These may not be the same as the expressions it was initialised with, if the values of its Parameters have been changed, i.e. have been learnt.)
Wrapping floats in `sympy.UnevaluatedExpr` will cause them not to be trained, by registering them as buffers rather than parameters.
```python
sympytorch.hide_floats(expression)
```
As a convenience, `hide_floats` will take an expression and return a new expression with every float wrapped in a `sympy.UnevaluatedExpr`, so that it is interpreted as a buffer rather than a parameter.
## Extensions
Not every PyTorch or SymPy operation is supported -- just the ones I've found I needed! There's a dictionary [here](./sympytorch/sympy_module.py#L12) that lists the supported operations. Feel free to submit PRs for any extra operations you think should be in by default. You can also use the `extra_funcs` argument to specify extra functions, including custom functions.