Name | ndtorch |
Version | 0.1.7 |
home_page | |
Summary | Higher order partial derivatives computation with respect to one or several tensor-like variables, application to nonlinear dynamics |
upload_time | 2023-08-31 17:23:13 |
maintainer | |
docs_url | None |
author | Ivan Morozov |
requires_python | >=3.10 |
license | MIT |
keywords | torch, derivative |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
[![Documentation Status](https://readthedocs.org/projects/ndtorch/badge/?version=latest)](https://ndtorch.readthedocs.io/en/latest/?badge=latest)
# ndtorch, 2022-2023
<p align="center">
<img width="100" height="100" src="docs/pics/logo.svg">
</p>
Computation of higher-order partial derivatives with respect to one or several tensor-like variables.
Taylor series function approximation (derivative table and series function representation).
Parametric fixed point computation.
# Install & build
```
$ git clone https://github.com/i-a-morozov/ndtorch.git
$ cd ndtorch
$ python -m pip install .
```
# Derivative (composable jacobian)
Compute higher-order (partial) derivatives; results are returned as tensors or nested lists of tensors ordered by derivative degree.
```python
>>> from ndtorch.derivative import derivative
>>> def fn(x):
...     return 1 + x + x**2 + x**3 + x**4 + x**5
...
>>> import torch
>>> x = torch.tensor(0.0)
>>> derivative(5, fn, x)
[tensor(1.), tensor(1.), tensor(2.), tensor(6.), tensor(24.), tensor(120.)]
```
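The values above are the raw derivatives f⁽ⁿ⁾(0) of the polynomial, which here equal n!. As an independent cross-check, the same sequence can be sketched with plain `torch.autograd.grad` and repeated differentiation (no `ndtorch` required):

```python
import torch

def fn(x):
    return 1 + x + x**2 + x**3 + x**4 + x**5

# differentiate repeatedly at x = 0, keeping the graph between orders
x = torch.tensor(0.0, requires_grad=True)
y = fn(x)
values = [y.item()]
for _ in range(5):
    (y,) = torch.autograd.grad(y, x, create_graph=True)
    values.append(y.item())
print(values)  # [1.0, 1.0, 2.0, 6.0, 24.0, 120.0]
```

`create_graph=True` is what makes each gradient itself differentiable, so the loop can climb to arbitrary order; `ndtorch` automates this bookkeeping for tensor-valued arguments.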
```python
>>> from ndtorch.derivative import derivative
>>> def fn(x):
...     x1, x2 = x
...     return x1**2 + x1*x2 + x2**2
...
>>> import torch
>>> x = torch.tensor([0.0, 0.0])
>>> derivative(2, fn, x, intermediate=False)
tensor([[2., 1.],
        [1., 2.]])
```
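With `intermediate=False` and order 2, the result is just the Hessian of the quadratic form. PyTorch's built-in `torch.autograd.functional.hessian` gives the same matrix and can serve as a sanity check (a plain-torch sketch, not `ndtorch`'s API):

```python
import torch
from torch.autograd.functional import hessian

def fn(x):
    x1, x2 = x
    return x1**2 + x1*x2 + x2**2

# Hessian of a quadratic form is constant: [[2, 1], [1, 2]]
H = hessian(fn, torch.tensor([0.0, 0.0]))
print(H)  # tensor([[2., 1.], [1., 2.]])
```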
```python
>>> from ndtorch.derivative import derivative
>>> def fn(x, y):
...     x1, x2 = x
...     return x1**2*(1 + y) + x2**2*(1 - y)
...
>>> import torch
>>> x = torch.tensor([0.0, 0.0])
>>> y = torch.tensor(0.0)
>>> derivative((2, 1), fn, x, y)
[[tensor(0.), tensor(0.)], [tensor([0., 0.]), tensor([0., 0.])], [tensor([[2., 0.],
        [0., 2.]]), tensor([[ 2.,  0.],
        [ 0., -2.]])]]
```
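The last pair in the output is the Hessian with respect to `x` and its first derivative with respect to `y`. That mixed block can be reproduced by hand with nested `torch.autograd.grad` calls (a sketch under the same test function; entries of the Hessian that do not depend on `y` are treated as zero):

```python
import torch

def fn(x, y):
    x1, x2 = x
    return x1**2 * (1 + y) + x2**2 * (1 - y)

x = torch.tensor([0.0, 0.0], requires_grad=True)
y = torch.tensor(0.0, requires_grad=True)

# first-order gradient with respect to x, keeping the graph for higher orders
(g,) = torch.autograd.grad(fn(x, y), x, create_graph=True)

# Hessian rows with respect to x, still differentiable in y:
# [[2*(1 + y), 0], [0, 2*(1 - y)]]
H = torch.stack([torch.autograd.grad(g[i], x, create_graph=True)[0]
                 for i in range(2)])

# differentiate each Hessian entry once with respect to y;
# entries structurally independent of y come back as None -> zero
mixed = torch.zeros(2, 2)
for i in range(2):
    for j in range(2):
        (d,) = torch.autograd.grad(H[i, j], y,
                                   retain_graph=True, allow_unused=True)
        mixed[i, j] = 0.0 if d is None else d
print(mixed)  # [[2., 0.], [0., -2.]]
```

This is the manual version of what `derivative((2, 1), fn, x, y)` computes for every order up to (2, 1) at once.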
# Derivative (gradient)
Compute higher-order (partial) derivatives; results are returned as a table (dictionary) keyed by multi-indices of differentiation orders.
```python
>>> from ndtorch.gradient import series
>>> def fn(x):
...     return 1 + x + x**2 + x**3 + x**4 + x**5
...
>>> import torch
>>> x = torch.tensor([0.0])
>>> series((5, ), fn, x, retain=False, series=False)
{(0,): tensor([1.]),
 (1,): tensor([1.]),
 (2,): tensor([2.]),
 (3,): tensor([6.]),
 (4,): tensor([24.]),
 (5,): tensor([120.])}
```
```python
>>> from ndtorch.gradient import series
>>> def fn(x):
...     x1, x2 = x
...     return x1**2 + x1*x2 + x2**2
...
>>> import torch
>>> x = torch.tensor([0.0, 0.0])
>>> series((2, ), fn, x, intermediate=False, retain=False, series=False)
{(2, 0): tensor(2.), (1, 1): tensor(1.), (0, 2): tensor(2.)}
```
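The keys `(i, j)` are multi-indices: the order of differentiation in each component of `x`. Such a table determines the Taylor polynomial, so it can be evaluated back into a function. A minimal sketch (the `table` values are copied from the output above; `taylor` is a hypothetical helper, not part of `ndtorch`):

```python
import math
import torch

# derivative table from the output above: key (i, j) is the differentiation
# order in (x1, x2), value is the corresponding partial derivative at 0
table = {(2, 0): 2.0, (1, 1): 1.0, (0, 2): 2.0}

def taylor(table, x):
    # f(x) ~ sum over multi-indices a of D^a f(0) / a! * x**a
    total = 0.0
    for (i, j), c in table.items():
        total += c / (math.factorial(i) * math.factorial(j)) * x[0]**i * x[1]**j
    return total

x = torch.tensor([1.0, 2.0])
print(taylor(table, x).item())  # 7.0 == fn(x) exactly, since fn is quadratic
```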
```python
>>> from ndtorch.gradient import series
>>> def fn(x, y):
...     x1, x2 = x
...     y1, = y
...     return x1**2*(1 + y1) + x2**2*(1 - y1)
...
>>> import torch
>>> x = torch.tensor([0.0, 0.0])
>>> y = torch.tensor([0.0])
>>> series((2, 1), fn, x, y, retain=False, series=False)
{(0, 0, 0): tensor(0.),
 (0, 0, 1): tensor(0.),
 (1, 0, 0): tensor(0.),
 (0, 1, 0): tensor(0.),
 (1, 0, 1): tensor(0.),
 (0, 1, 1): tensor(-0.),
 (2, 0, 0): tensor(2.),
 (1, 1, 0): tensor(0.),
 (0, 2, 0): tensor(2.),
 (2, 0, 1): tensor(2.),
 (1, 1, 1): tensor(0.),
 (0, 2, 1): tensor(-2.)}
```
# Description
```python
>>> import ndtorch
>>> ndtorch.__about__
```
# Animations
Approximation of stable and unstable invariant manifolds
<p align="center">
<img width="576" height="576" src="docs/pics/manifold.gif">
</p>
Collision of fixed points
<p align="center">
<img width="576" height="576" src="docs/pics/collision.gif">
</p>
Reducing the real part of a hyperbolic fixed point
<p align="center">
<img width="576" height="576" src="docs/pics/change.gif">
</p>