blackjax-nightly

Name: blackjax-nightly
Version: 0.9.6.post131
Summary: Flexible and fast sampling in Python
Upload time: 2023-02-01 15:17:21
Requires Python: >=3.7
License: Apache License 2.0
Keywords: probability, machine learning, statistics, mcmc, sampling
            # BlackJAX
![CI](https://github.com/blackjax-devs/blackjax/workflows/Run%20tests/badge.svg?branch=main)
[![codecov](https://codecov.io/gh/blackjax-devs/blackjax/branch/main/graph/badge.svg)](https://codecov.io/gh/blackjax-devs/blackjax)


## What is BlackJAX?

BlackJAX is a library of samplers for [JAX](https://github.com/google/jax) that
works on CPU as well as GPU.

It is *not* a probabilistic programming library. However, it integrates well
with PPLs as long as they can provide a (potentially unnormalized)
log-probability density function compatible with JAX.

## Who should use BlackJAX?

BlackJAX should appeal to those who:
- Have a logpdf and just need a sampler;
- Need more than a general-purpose sampler;
- Want to sample on GPU;
- Want to build upon robust elementary blocks for their research;
- Are building a probabilistic programming language;
- Want to learn how sampling algorithms work.

## Quickstart

### Installation

You can install BlackJAX using `pip`:

```bash
pip install blackjax
```

or via conda-forge:

```bash
conda install -c conda-forge blackjax
```

Nightly builds (bleeding edge) of BlackJAX can also be installed using `pip`:

```bash
pip install blackjax-nightly
```

BlackJAX is written in pure Python but depends on XLA via JAX. By default, the
version of JAX installed alongside BlackJAX will run your code on CPU only.
**If you want to use BlackJAX on GPU/TPU**, we recommend you follow
[these instructions](https://github.com/google/jax#installation) to install JAX
with the relevant hardware acceleration support.

### Example

Let us look at a simple self-contained example sampling with NUTS:

```python
import jax
import jax.numpy as jnp
import jax.scipy.stats as stats
import numpy as np

import blackjax

observed = np.random.normal(10, 20, size=1_000)
def logdensity_fn(x):
    logpdf = stats.norm.logpdf(observed, x["loc"], x["scale"])
    return jnp.sum(logpdf)

# Build the kernel
step_size = 1e-3
inverse_mass_matrix = jnp.array([1., 1.])
nuts = blackjax.nuts(logdensity_fn, step_size, inverse_mass_matrix)

# Initialize the state
initial_position = {"loc": 1., "scale": 2.}
state = nuts.init(initial_position)

# Iterate
rng_key = jax.random.PRNGKey(0)
for _ in range(100):
    rng_key, nuts_key = jax.random.split(rng_key)
    state, _ = nuts.step(nuts_key, state)
```

See [the documentation](https://blackjax-devs.github.io/blackjax/index.html) for more examples of how to use the library: how to write inference loops for one or several chains, how to use the Stan warmup, etc.
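The Python `for` loop above runs one step at a time in the interpreter; in JAX, inference loops are usually written with `jax.lax.scan` so the whole loop can be JIT-compiled. Below is a minimal sketch of that pattern, using a trivial stand-in kernel (not an actual BlackJAX sampler) so the snippet stays self-contained:

```python
import jax
import jax.numpy as jnp


def dummy_kernel(rng_key, state):
    # Stand-in for a real kernel such as `nuts.step`: it takes (rng_key, state)
    # and returns (new_state, info). Here the "transition" is a random walk.
    new_state = state + 0.1 * jax.random.normal(rng_key, state.shape)
    return new_state, None


def inference_loop(rng_key, kernel, initial_state, num_samples):
    def one_step(state, rng_key):
        state, _ = kernel(rng_key, state)
        return state, state  # carry the state forward, also record it

    keys = jax.random.split(rng_key, num_samples)
    _, states = jax.lax.scan(one_step, initial_state, keys)
    return states


# JIT-compile the whole loop; the kernel and sample count are static arguments.
states = jax.jit(inference_loop, static_argnums=(1, 3))(
    jax.random.PRNGKey(0), dummy_kernel, jnp.zeros(2), 100
)
# `states` stacks one row per draw: shape (100, 2).
```

To sample with the NUTS example above, you would pass `nuts.step` and the state from `nuts.init(initial_position)` in place of the stand-in kernel and initial state.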

## Philosophy

### What is BlackJAX?

BlackJAX bridges the gap between "one-liner" frameworks and modular, customizable
libraries.

Users can import the library and interact with robust, well-tested and performant
samplers with a few lines of code. These samplers are aimed at PPL developers,
or people who have a logpdf and just need a sampler that works.

But the true strength of BlackJAX lies in its internals and how they can be used
to experiment quickly with existing or new sampling schemes. This lower level
exposes the building blocks of inference algorithms: integrators, proposals,
momentum generators, etc., and makes it easy to combine them into new
algorithms. By providing robust, performant and reusable code, it helps
accelerate research on sampling algorithms.

### Why BlackJAX?

Sampling algorithms are too often integrated into PPLs rather than decoupled
from the rest of the framework, which makes them hard to use for people who do
not need the modeling language to build their logpdf. Their implementations are
also typically monolithic, so parts of an algorithm cannot be reused to build
custom kernels. BlackJAX solves both problems.

### How does it work?

BlackJAX lets you build arbitrarily complex algorithms because it is built
around a very general pattern: everything that takes a state and returns a state
is a transition kernel, and is implemented as:

```python
new_state, info = kernel(rng_key, state)
```

Kernels are stateless functions that all follow the same API; the state and
information related to the transition are returned separately. They can thus be
easily composed and exchanged. We specialize these kernels by closure instead of
passing parameters.
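As an illustration of this pattern (a sketch of the idea, not BlackJAX's actual internals), here is a random-walk Metropolis kernel whose parameters are fixed by closure, leaving only the uniform `(rng_key, state)` interface:

```python
import jax
import jax.numpy as jnp


def rwmh_kernel(logdensity_fn, sigma):
    """Return a transition kernel specialized by closure: `logdensity_fn` and
    `sigma` are captured here, so the returned function only takes
    (rng_key, state) like every other kernel."""

    def kernel(rng_key, position):
        key_proposal, key_accept = jax.random.split(rng_key)
        proposal = position + sigma * jax.random.normal(key_proposal, position.shape)
        log_ratio = logdensity_fn(proposal) - logdensity_fn(position)
        do_accept = jnp.log(jax.random.uniform(key_accept)) < log_ratio
        new_position = jnp.where(do_accept, proposal, position)
        info = {"is_accepted": do_accept, "log_ratio": log_ratio}
        return new_position, info

    return kernel


# Specialize once, then step through the uniform kernel interface.
logdensity = lambda x: -0.5 * jnp.sum(x**2)  # standard normal, unnormalized
kernel = rwmh_kernel(logdensity, sigma=0.5)
state = jnp.zeros(2)
state, info = kernel(jax.random.PRNGKey(0), state)
```

Because the parameters are baked in at construction time, kernels built this way are interchangeable in any inference loop that expects `new_state, info = kernel(rng_key, state)`.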

## Contributions

### What contributions?

We value the following contributions:
- Bug fixes
- Documentation
- High-level sampling algorithms from any family of algorithms: random walk,
  Hamiltonian Monte Carlo, Sequential Monte Carlo, variational inference,
  inference compilation, etc.
- New building blocks, e.g. new metrics for HMC, integrators, etc.

### How to contribute?

1. Run `pip install -r requirements.txt` to install all the dev
   dependencies.
2. Run `pre-commit run --all-files` and `make test` before pushing to the repo; CI should pass if
   these pass locally.

## Citing Blackjax

To cite this repository:

```bibtex
@software{blackjax2020github,
  author = {Lao, Junpeng and Louf, R\'emi},
  title = {{B}lackjax: A sampling library for {JAX}},
  url = {http://github.com/blackjax-devs/blackjax},
  version = {<insert current release tag>},
  year = {2020},
}
```
In the above BibTeX entry, names are in alphabetical order and the version number should be the last tag on the `main` branch.

## Acknowledgements

Some details of the NUTS implementation were largely inspired by
[Numpyro](https://github.com/pyro-ppl/numpyro)'s.

            
