| Field | Value |
| --- | --- |
| Name | gprob |
| Version | 1.0.1 |
| Summary | Probabilistic programming with arrays of Gaussian variables. |
| Author email | Sergey Fedorov (fedorov.s.a@outlook.com) |
| Source | https://github.com/SAFedorov/gprob |
| Requires Python | >=3.9 |
| License | MIT License, Copyright (c) 2024 Sergey A. Fedorov |
| Keywords | gaussian distribution, noise, random variables, stochastic processes, gaussian processes, probabilistic programming, python, numpy, scipy |
| Upload time | 2024-10-16 14:36:19 |
# gprob
gprob is a Python package that implements a probabilistic programming language for Gaussian random variables with exact conditioning. It is built around the idea that arrays of Gaussian random variables can be handled in the same way as numerical NumPy arrays.
To give a flavor of it, the first example shows a few operations on scalar variables and their conditioning:
```python
>>> import gprob as gp
>>> x = gp.normal()
>>> y = gp.normal()
>>> z = x + 0.2 * y + 3
>>> z
Normal(mean=3, var=1.04)
>>> z | {y - 0.5 * x: 1} # conditioning
Normal(mean=2.76, var=0.968)
```
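For reference, the conditioned mean and variance above follow from the standard formula for conditioning jointly Gaussian variables, and can be reproduced with plain NumPy. The snippet below is an independent check of the numbers, not part of gprob:
```python
# With x, y i.i.d. standard normals, z = x + 0.2*y + 3 and c = y - 0.5*x
# are jointly Gaussian, so the textbook conditioning formula applies:
#   E[z | c=v] = mu_z + cov(z, c)/var(c) * (v - mu_c)
#   Var[z | c] = var(z) - cov(z, c)**2 / var(c)
import numpy as np

a = np.array([1.0, 0.2])    # coefficients of z on (x, y)
b = np.array([-0.5, 1.0])   # coefficients of c on (x, y)
var_z, var_c, cov_zc = a @ a, b @ b, a @ b

print(3 + cov_zc / var_c * (1 - 0))    # 2.76
print(var_z - cov_zc**2 / var_c)       # 0.968 (up to floating point)
```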
The second example constructs a random walk of a Brownian particle that is observed at x=0 at the beginning and at x=1 midway through its motion:
```python
>>> nstep = 5 * 10**3
>>> dx = gp.normal(0, 1/nstep, size=(nstep,))
>>> x = gp.cumsum(dx, 0) # unconditional particle positions
>>> xc = x | {x[nstep//2]: 1} # positions conditioned on x[nstep//2] == 1
>>> samples = xc.sample(10**2) # sampling 100 trajectories
```
```python
>>> import matplotlib.pyplot as plt
>>> plt.plot(samples.T, alpha=0.1, color='gray')
>>> plt.show()
```
![brownian readme](./assets/brownian_readme.png)
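As a quick sanity check on the conditioning (assuming `samples` is a NumPy array of shape `(100, nstep)`, which is what plotting `samples.T` as individual trajectories suggests), the average of the sampled trajectories should pass through the observed value at the midpoint:
```python
>>> import numpy as np
>>> mean_traj = samples.mean(axis=0)   # average over the 100 sampled trajectories
>>> print(mean_traj[nstep // 2])       # ~1: pinned by the conditioning
>>> print(mean_traj[0])                # ~0: every walk starts at the origin
```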
## Requirements
* python >= 3.9
* [numpy](https://numpy.org/) >= 1.25
* [scipy](https://scipy.org/)
## Installation
The package can be installed from PyPI,
```
pip install gprob
```
or from this repository (to get the latest version),
```
pip install git+https://github.com/SAFedorov/gprob.git
```
## Getting started
Have a look at the notebooks in the [examples](examples) folder, starting from the tutorials on
1. [Random variables](examples/1-random-variables.ipynb)
2. [Array operations](examples/2-array-operations.ipynb)
3. [Sparse arrays](examples/3-sparse-arrays.ipynb)
4. [Likelihood fitting](examples/4-likelihood-fitting-fisher.ipynb)
roughly in this order.
## How it works
There is a supplementary [note](https://safedorov.github.io/gprob-note/) that presents some of the underlying theory, especially the theory of inference.
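Whatever the implementation details, exact conditioning of Gaussians is governed by the standard textbook result: if the quantity of interest $z$ and the conditioned combination $c$ are jointly Gaussian, then

$$
z \mid (c = v) \;\sim\; \mathcal{N}\!\left(\mu_z + \Sigma_{zc}\Sigma_{cc}^{-1}(v - \mu_c),\;\; \Sigma_{zz} - \Sigma_{zc}\Sigma_{cc}^{-1}\Sigma_{cz}\right).
$$

This reproduces the numbers of the first example above, and the $\Sigma_{cc}^{-1}$ factor is the source of the cubic conditioning cost discussed in the next section.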
## How many variables it can handle
General multivariate Gaussian distributions of *n* variables require memory quadratic in *n* for their storage, and computational time cubic in *n* for their exact conditioning. My laptop can typically handle arrays whose sizes are in the thousands.
If the Gaussian variables are such that their joint distribution is a direct product, they can be packed into sparse arrays. For those, memory and computational requirements grow linearly with the number of independent distributions, and the total number of variables can be larger.
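For a back-of-the-envelope sense of the dense-case limits (a rough estimate of the covariance storage alone, not a measurement of gprob itself):
```python
# Storage for a dense n-by-n covariance matrix of float64 entries.
for n in (5_000, 50_000):
    print(n, f"variables -> {n**2 * 8 / 1e9:.1f} GB")
# 5000 variables -> 0.2 GB
# 50000 variables -> 20.0 GB  (already impractical on a laptop)
```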
## Acknowledgements
gprob was inspired by (but works differently from) [GaussianInfer](https://github.com/damast93/GaussianInfer). See the corresponding paper,
D. Stein and S. Staton, "Compositional Semantics for Probabilistic Programs with Exact Conditioning," 2021 36th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS), Rome, Italy, 2021, pp. 1-13, doi: 10.1109/LICS52264.2021.9470552.
gprob uses the subscript parser from [opt-einsum](https://github.com/dgasmith/opt_einsum). Some linearization tricks and choices of tooling follow [autograd](https://github.com/HIPS/autograd).