torchsde


Name: torchsde
Version: 0.2.6
Home page: https://github.com/google-research/torchsde
Summary: SDE solvers and stochastic adjoint sensitivity analysis in PyTorch.
Author: Xuechen Li, Patrick Kidger
Requires Python: >=3.8
Upload time: 2023-09-26 21:52:20

            # PyTorch Implementation of Differentiable SDE Solvers ![Python package](https://github.com/google-research/torchsde/workflows/Python%20package/badge.svg?branch=dev)
This library provides [stochastic differential equation (SDE)](https://en.wikipedia.org/wiki/Stochastic_differential_equation) solvers with GPU support and efficient backpropagation.

---
<p align="center">
  <img width="600" height="450" src="./assets/latent_sde.gif">
</p>

## Installation
```shell script
pip install torchsde
```

**Requirements:** Python >=3.8 and PyTorch >=1.6.0.
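
A quick sanity check after installing is to import both packages and print their versions (a minimal sketch; `torchsde.__version__` is assumed to be exposed as in recent releases, and a bare `import torchsde` is enough on its own):

```python
# Minimal post-install sanity check.
import torch
import torchsde

print(torch.__version__)     # expected >= 1.6.0
print(torchsde.__version__)  # expected 0.2.6 for this release (assumed attribute)
```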

## Documentation
Available [here](./DOCUMENTATION.md).

## Examples
### Quick example
```python
import torch
import torchsde

batch_size, state_size, brownian_size = 32, 3, 2
t_size = 20

class SDE(torch.nn.Module):
    noise_type = 'general'
    sde_type = 'ito'

    def __init__(self):
        super().__init__()
        self.mu = torch.nn.Linear(state_size, 
                                  state_size)
        self.sigma = torch.nn.Linear(state_size, 
                                     state_size * brownian_size)

    # Drift
    def f(self, t, y):
        return self.mu(y)  # shape (batch_size, state_size)

    # Diffusion
    def g(self, t, y):
        return self.sigma(y).view(batch_size, 
                                  state_size, 
                                  brownian_size)

sde = SDE()
y0 = torch.full((batch_size, state_size), 0.1)
ts = torch.linspace(0, 1, t_size)
# Initial state is y0; the SDE is solved over the interval [ts[0], ts[-1]].
# ys will have shape (t_size, batch_size, state_size)
ys = torchsde.sdeint(sde, y0, ts)
```
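
Backpropagating through `sdeint` differentiates through the solver's internal operations, which stores every step. The stochastic adjoint method of [\[1\]](https://arxiv.org/pdf/2001.01328.pdf) is exposed as `torchsde.sdeint_adjoint`, a drop-in replacement that uses far less memory at the cost of extra computation. Below is a minimal sketch, not taken from the library's examples: it uses a small diagonal-noise SDE (each state coordinate driven by its own Brownian component, so `g` returns shape `(batch_size, state_size)`), an illustrative step size, and a placeholder loss.

```python
import torch
import torchsde

batch_size, state_size, t_size = 32, 3, 20

class DiagonalSDE(torch.nn.Module):
    noise_type = 'diagonal'
    sde_type = 'ito'

    def __init__(self):
        super().__init__()
        self.mu = torch.nn.Linear(state_size, state_size)
        self.sigma = torch.nn.Linear(state_size, state_size)

    def f(self, t, y):
        return self.mu(y)     # drift, shape (batch_size, state_size)

    def g(self, t, y):
        return self.sigma(y)  # diagonal diffusion, shape (batch_size, state_size)

sde = DiagonalSDE()
y0 = torch.full((batch_size, state_size), 0.1)
ts = torch.linspace(0, 1, t_size)

# sdeint_adjoint solves the SDE forwards, then recovers gradients with the
# stochastic adjoint, keeping memory use independent of the number of steps.
ys = torchsde.sdeint_adjoint(sde, y0, ts, method='euler', dt=0.01)

loss = ys[-1].pow(2).mean()   # placeholder scalar objective
loss.backward()               # gradients w.r.t. the parameters of sde
```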

### Notebook

[`examples/demo.ipynb`](examples/demo.ipynb) gives a short guide on how to solve SDEs, including subtle points such as fixing the randomness in the solver and the choice of *noise types*.
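
One of those subtle points, fixing the randomness, comes down to constructing the Brownian motion explicitly and passing it to the solver. The sketch below continues directly from the quick example above (reusing `SDE`, `batch_size`, `brownian_size` and `t_size`); the `entropy` keyword of `torchsde.BrownianInterval` is used here as the seed, so treat the exact argument names as assumptions and defer to DOCUMENTATION.md.

```python
# Continues the quick example: SDE, batch_size, state_size, brownian_size, t_size.
sde = SDE()
y0 = torch.full((batch_size, state_size), 0.1)
ts = torch.linspace(0, 1, t_size)

# An explicit Brownian motion over [0, 1], seeded via `entropy`, with its
# trailing dimension matching the Brownian dimension of the diffusion.
bm = torchsde.BrownianInterval(
    t0=0.0,
    t1=1.0,
    size=(batch_size, brownian_size),
    entropy=42,
)

# Passing the same bm (same entropy) reproduces the same sample paths.
ys = torchsde.sdeint(sde, y0, ts, bm=bm, method='euler')
```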

### Latent SDE

[`examples/latent_sde.py`](examples/latent_sde.py) learns a *latent stochastic differential equation*, as in Section 5 of [\[1\]](https://arxiv.org/pdf/2001.01328.pdf).
The example fits an SDE to data, whilst regularizing it to be like an [Ornstein-Uhlenbeck](https://en.wikipedia.org/wiki/Ornstein%E2%80%93Uhlenbeck_process) prior process.
The model can be loosely viewed as a [variational autoencoder](https://en.wikipedia.org/wiki/Autoencoder#Variational_autoencoder_(VAE)) with its prior and approximate posterior being SDEs. This example can be run via
```shell script
python -m examples.latent_sde --train-dir <TRAIN_DIR>
```
The program outputs figures to the path specified by `<TRAIN_DIR>`.
Training should stabilize after 500 iterations with the default hyperparameters.
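
For orientation before reading the full script: the fit-plus-regularize structure can be written with the solver's `logqp` option, under which the SDE module defines a posterior drift `f(t, y)` and a prior drift `h(t, y)` (here Ornstein-Uhlenbeck) sharing a diagonal diffusion `g(t, y)`, and the solver additionally returns a pathwise log-ratio that serves as a KL-style penalty. The sketch below is a loose outline of one training step, not the actual objective in `examples/latent_sde.py`; `latent_sde.decode`, `xs` and `kl_weight` are hypothetical placeholders.

```python
import torch
import torchsde

def training_step(latent_sde, y0, ts, xs, optimizer, kl_weight=1.0):
    # latent_sde is assumed to define f (posterior drift), h (OU prior drift),
    # g (diagonal diffusion) and a decode() readout; all names are illustrative.
    optimizer.zero_grad()
    ys, log_ratio = torchsde.sdeint(latent_sde, y0, ts, method='euler', logqp=True)
    recon = ((latent_sde.decode(ys) - xs) ** 2).mean()   # placeholder reconstruction term
    kl = log_ratio.mean()                                # pushes the posterior toward the OU prior
    loss = recon + kl_weight * kl
    loss.backward()
    optimizer.step()
    return loss.detach()
```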

### Neural SDEs as GANs
[`examples/sde_gan.py`](examples/sde_gan.py) learns an SDE as a GAN, as in [\[2\]](https://arxiv.org/abs/2102.03657), [\[3\]](https://arxiv.org/abs/2105.13493). The example trains an SDE as the generator of a GAN, whilst using a [neural CDE](https://github.com/patrick-kidger/NeuralCDE) [\[4\]](https://arxiv.org/abs/2005.08926) as the discriminator. This example can be run via

```shell script
python -m examples.sde_gan
```
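
As a rough illustration of the generator half only (the neural CDE discriminator is omitted, and all names and sizes below are illustrative rather than taken from `examples/sde_gan.py`; the paper's Stratonovich setup is simplified to Ito noise here): the generator solves a learned latent SDE from a sampled initial condition and maps the latent path to data space with a readout.

```python
import torch
import torchsde

batch_size, latent_size, data_size, t_size = 32, 4, 2, 20

class GeneratorSDE(torch.nn.Module):
    # Illustrative latent SDE acting as the generator of the GAN.
    noise_type = 'diagonal'
    sde_type = 'ito'

    def __init__(self):
        super().__init__()
        self.drift = torch.nn.Sequential(
            torch.nn.Linear(latent_size, 64), torch.nn.Tanh(),
            torch.nn.Linear(64, latent_size),
        )
        self.diffusion = torch.nn.Linear(latent_size, latent_size)
        self.readout = torch.nn.Linear(latent_size, data_size)

    def f(self, t, y):
        return self.drift(y)       # drift, shape (batch_size, latent_size)

    def g(self, t, y):
        return self.diffusion(y)   # diagonal diffusion, shape (batch_size, latent_size)

generator = GeneratorSDE()
y0 = torch.randn(batch_size, latent_size)       # sampled initial latent state
ts = torch.linspace(0, 1, t_size)

latent_paths = torchsde.sdeint(generator, y0, ts, method='euler', dt=0.01)
fake_series = generator.readout(latent_paths)   # shape (t_size, batch_size, data_size)
# fake_series would then be scored by a neural CDE discriminator (not shown here).
```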

## Citation

If you find this codebase useful in your research, please consider citing either or both of:

```
@article{li2020scalable,
  title={Scalable gradients for stochastic differential equations},
  author={Li, Xuechen and Wong, Ting-Kam Leonard and Chen, Ricky T. Q. and Duvenaud, David},
  journal={International Conference on Artificial Intelligence and Statistics},
  year={2020}
}
```

```
@article{kidger2021neuralsde,
  title={Neural {SDE}s as {I}nfinite-{D}imensional {GAN}s},
  author={Kidger, Patrick and Foster, James and Li, Xuechen and Oberhauser, Harald and Lyons, Terry},
  journal={International Conference on Machine Learning},
  year={2021}
}
```

## References

\[1\] Xuechen Li, Ting-Kam Leonard Wong, Ricky T. Q. Chen, David Duvenaud. "Scalable Gradients for Stochastic Differential Equations". *International Conference on Artificial Intelligence and Statistics*, 2020. [[arXiv]](https://arxiv.org/pdf/2001.01328.pdf)

\[2\] Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons. "Neural SDEs as Infinite-Dimensional GANs". *International Conference on Machine Learning*, 2021. [[arXiv]](https://arxiv.org/abs/2102.03657)

\[3\] Patrick Kidger, James Foster, Xuechen Li, Terry Lyons. "Efficient and Accurate Gradients for Neural SDEs". 2021. [[arXiv]](https://arxiv.org/abs/2105.13493)

\[4\] Patrick Kidger, James Morrill, James Foster, Terry Lyons. "Neural Controlled Differential Equations for Irregular Time Series". *Neural Information Processing Systems*, 2020. [[arXiv]](https://arxiv.org/abs/2005.08926)

---
This is a research project, not an official Google product. 

            
