Name: pgmax
Version: 0.6.1
Summary: Loopy belief propagation for factor graphs on discrete variables in JAX
Author: DeepMind
License: Apache 2.0
Upload time: 2023-08-11 01:52:38

[![continuous-integration](https://github.com/deepmind/PGMax/actions/workflows/ci.yaml/badge.svg)](https://github.com/deepmind/PGMax/actions/workflows/ci.yaml)
[![PyPI version](https://badge.fury.io/py/pgmax.svg)](https://badge.fury.io/py/pgmax)
[![Documentation Status](https://readthedocs.org/projects/pgmax/badge/?version=latest)](https://pgmax.readthedocs.io/en/latest/?badge=latest)

# PGMax

PGMax implements general [factor graphs](https://en.wikipedia.org/wiki/Factor_graph)
for discrete probabilistic graphical models (PGMs), and
hardware-accelerated differentiable [loopy belief propagation (LBP)](https://en.wikipedia.org/wiki/Belief_propagation)
in [JAX](https://jax.readthedocs.io/en/latest/).

- **General factor graphs**: PGMax supports easy specification of general
factor graphs with potentially complicated topology, factor definitions,
and discrete variables with a varying number of states.
- **LBP in JAX**: PGMax generates pure JAX functions implementing LBP for a
given factor graph. The generated pure JAX functions run on modern accelerators
(GPU/TPU), work with JAX transformations
(e.g. `vmap` for processing batches of models/samples,
`grad` for differentiating through the LBP iterative process),
and can be easily used as part of a larger end-to-end differentiable system.
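
To make the "pure JAX functions" point concrete, here is a small, self-contained sketch of the kind of computation one LBP step involves: a max-product message update through a pairwise factor, written as an ordinary JAX function. This is illustrative only and does not use PGMax's own API; the function name and shapes are made up for the example.

```
import jax
import jax.numpy as jnp

def max_product_update(log_potential, incoming_msg):
  """One max-product message through a pairwise factor (log space).

  log_potential: (num_states, num_states) pairwise log-potential matrix.
  incoming_msg:  (num_states,) log-space message arriving at the factor.
  """
  outgoing = jnp.max(log_potential + incoming_msg[None, :], axis=1)
  return outgoing - jnp.max(outgoing)  # normalize in log space

# Because the update is a pure JAX function, it composes with JAX
# transformations such as vmap (batching) and grad (differentiation).
batched_update = jax.vmap(max_product_update, in_axes=(None, 0))
grad_wrt_potentials = jax.grad(lambda lp, msg: max_product_update(lp, msg).sum())

log_potential = jnp.array([[1.0, -1.0], [-1.0, 1.0]])
msgs = jnp.zeros((8, 2))  # a batch of 8 uniform (all-zero) log messages
print(batched_update(log_potential, msgs).shape)          # (8, 2)
print(grad_wrt_potentials(log_potential, msgs[0]).shape)  # (2, 2)
```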

See our [companion paper](https://arxiv.org/abs/2202.04110) for more details.

PGMax is under active development. APIs may change without notice,
so expect rough edges!

[**Installation**](#installation)
| [**Getting started**](#getting-started)

## Installation

### Install from PyPI
```
pip install pgmax
```

### Install latest version from GitHub
```
pip install git+https://github.com/deepmind/PGMax.git
```

### Developer
While you can install PGMax in your standard Python environment,
we *strongly* recommend using a
[Python virtual environment](https://docs.python.org/3/tutorial/venv.html)
to manage your dependencies. This helps avoid version conflicts and
generally makes the installation process easier.

```
git clone https://github.com/deepmind/PGMax.git
cd PGMax
python3 -m venv pgmax_env
source pgmax_env/bin/activate
pip install --upgrade pip setuptools
pip install -r requirements.txt
python3 setup.py develop
```

### Install on GPU

By default, the above commands install the CPU-only version of JAX. If you have access to a GPU,
follow the official instructions [here](https://github.com/google/jax#pip-installation-gpu-cuda)
to install JAX for GPU.
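
For reference, at the time of this release the CUDA 12 wheels could typically be installed with a command along these lines; the supported CUDA versions and package extras change over time, so always defer to the linked instructions:

```
pip install --upgrade "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
```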

## Getting Started


Here are a few self-contained Colab notebooks to help you get started with PGMax:

- [Tutorial on basic PGMax usage](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/rbm.ipynb)
- [LBP inference on Ising model](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/ising_model.ipynb)
- [Implementing max-product LBP](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/rcn.ipynb)
for [Recursive Cortical Networks](https://www.science.org/doi/10.1126/science.aag2612)
- [End-to-end differentiable LBP for gradient-based PGM training](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/gmrf.ipynb)
- [2D binary deconvolution](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/pmp_binary_deconvolution.ipynb)
- [Alternative inference with Smooth Dual LP-MAP](https://colab.research.google.com/github/deepmind/PGMax/blob/master/examples/sdlp_examples.ipynb)
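
If you want a feel for the overall workflow before opening a notebook, the sketch below mirrors the structure of the Ising-model example: define a variable group, build a factor graph, add pairwise factors, then run BP and decode MAP states. The specific names used here (`vgroup.NDVarArray`, `fgraph.FactorGraph`, `fgroup.PairwiseFactorGroup`, `infer.BP`, `run_bp`, `decode_map_states`) are taken from the linked notebooks and may differ slightly between releases, so treat this as an approximation of the API rather than a definitive reference.

```
import jax
import numpy as np
from pgmax import fgraph, fgroup, infer, vgroup

# A small 10x10 grid of binary variables (Ising-style model).
variables = vgroup.NDVarArray(num_states=2, shape=(10, 10))
fg = fgraph.FactorGraph(variable_groups=variables)

# Pairwise factors between each variable and its right/bottom neighbors.
variables_for_factors = []
for i in range(10):
    for j in range(10):
        variables_for_factors.append([variables[i, j], variables[(i + 1) % 10, j]])
        variables_for_factors.append([variables[i, j], variables[i, (j + 1) % 10]])

fg.add_factors(
    fgroup.PairwiseFactorGroup(
        variables_for_factors=variables_for_factors,
        log_potential_matrix=0.8 * np.array([[1.0, -1.0], [-1.0, 1.0]]),
    )
)

# Max-product BP (temperature=0.0) with random evidence; method names
# follow the example notebooks and may differ in newer releases.
bp = infer.BP(fg.bp_state, temperature=0.0)
bp_arrays = bp.init(
    evidence_updates={variables: jax.device_put(np.random.gumbel(size=(10, 10, 2)))}
)
bp_arrays = bp.run_bp(bp_arrays, num_iters=100, damping=0.5)
map_states = infer.decode_map_states(bp.get_beliefs(bp_arrays))
```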

## Citing PGMax

Please consider citing our [companion paper](https://arxiv.org/abs/2202.04110):
```
@article{zhou2022pgmax,
  author = {Zhou, Guangyao and Dedieu, Antoine and Kumar, Nishanth and L{\'a}zaro-Gredilla, Miguel and Kushagra, Shrinu and George, Dileep},
  title = {{PGMax: Factor Graphs for Discrete Probabilistic Graphical Models and Loopy Belief Propagation in JAX}},
  journal = {arXiv preprint arXiv:2202.04110},
  year={2022}
}
```
and using the [DeepMind JAX Ecosystem citation](https://github.com/deepmind/jax/blob/main/deepmind2020jax.txt) if you use PGMax in your work.

## Note

This is not an officially supported Google product.

            
