|Tests Status| |Coverage| |Gitter|
AeMCMC is a Python library that automates the construction of samplers for `Aesara <https://github.com/aesara-devs/aesara>`_ graphs that represent statistical models.
Features
========
This project is in an alpha state, but its basic features and objectives are as follows:
- Provide utilities that simplify the process of constructing Aesara graphs/functions for posterior and posterior predictive sampling
- Host a wide array of "exact" posterior sampling steps (e.g. Gibbs steps, scale-mixture/decomposition-based conditional samplers, etc.)
- Build a framework for identifying and composing said sampler steps and enumerating the possible samplers for an arbitrary model
Overall, we would like this project to serve as a hub for community-sourced specialized samplers and facilitate their general use.
Getting started
===============
Using AeMCMC, one can construct sampling steps from a graph containing Aesara
`RandomVariable`\s. AeMCMC analyzes the model graph and possibly rewrites it
to find the most suitable sampler.
AeMCMC can recognize closed-form posteriors; for instance, sampling the following
Beta-Binomial model amounts to sampling from a Beta distribution:

.. code-block:: python

    import aesara
    import aemcmc
    import aesara.tensor as at

    srng = at.random.RandomStream(0)

    p_rv = srng.beta(1., 1., name="p")
    Y_rv = srng.binomial(10, p_rv, name="Y")

    y_vv = Y_rv.clone()
    y_vv.name = "y"

    sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

    p_posterior_step = sampler.sample_steps[p_rv]
    aesara.dprint(p_posterior_step)
    # beta_rv{0, (0, 0), floatX, False}.1 [id A]
    #  |RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F77B2831200>) [id B]
    #  |TensorConstant{[]} [id C]
    #  |TensorConstant{11} [id D]
    #  |Elemwise{add,no_inplace} [id E]
    #  | |TensorConstant{1.0} [id F]
    #  | |y [id G]
    #  |Elemwise{sub,no_inplace} [id H]
    #  | |Elemwise{add,no_inplace} [id I]
    #  | | |TensorConstant{1.0} [id F]
    #  | | |TensorConstant{10} [id J]
    #  | |y [id G]

    sample_fn = aesara.function([y_vv], p_posterior_step)
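
The graph printed above exploits Beta-Binomial conjugacy: the posterior's Beta parameters are ``1 + y`` and ``1 + 10 - y``. As a standalone sanity check, the conjugate update can be written out by hand; the helper below is purely illustrative and not part of AeMCMC's API:

```python
# Conjugate Beta-Binomial update, mirroring the graph above:
# a Beta(alpha, beta) prior combined with y successes out of n
# Binomial trials yields a Beta(alpha + y, beta + n - y) posterior.
def beta_binomial_posterior(alpha, beta, n, y):
    """Posterior Beta parameters after observing y successes in n trials."""
    return alpha + y, beta + (n - y)

# With the Beta(1, 1) prior and n = 10 trials from the model,
# observing y = 4 successes gives a Beta(5, 7) posterior.
print(beta_binomial_posterior(1.0, 1.0, 10, 4))  # (5.0, 7.0)
```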
AeMCMC also contains a database of Gibbs samplers that can be used to sample
some models more efficiently than a general-purpose sampler like NUTS
would:

.. code-block:: python

    import aemcmc
    import aesara
    import aesara.tensor as at

    srng = at.random.RandomStream(0)

    X = at.matrix("X")

    # Horseshoe prior for `beta_rv`
    tau_rv = srng.halfcauchy(0, 1, name="tau")
    lmbda_rv = srng.halfcauchy(0, 1, size=X.shape[1], name="lambda")
    beta_rv = srng.normal(0, lmbda_rv * tau_rv, size=X.shape[1], name="beta")

    a = at.scalar("a")
    b = at.scalar("b")
    h_rv = srng.gamma(a, b, name="h")

    # Negative-binomial regression
    eta = X @ beta_rv
    p = at.sigmoid(-eta)
    Y_rv = srng.nbinom(h_rv, p, name="Y")

    y_vv = Y_rv.clone()
    y_vv.name = "y"

    sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

    # `sampler.sample_steps` contains the sample step for each random variable
    print(sampler.sample_steps[h_rv])
    # h_posterior

    # `sampler.stages` contains the sampling kernels sorted by scan order
    print(sampler.stages)
    # {HorseshoeGibbsKernel: [tau, lambda], NBRegressionGibbsKernel: [beta], DispersionGibbsKernel: [h]}

    # Build a function that returns new samples
    to_sample_rvs = [tau_rv, lmbda_rv, beta_rv, h_rv]
    inputs = [a, b, X, y_vv] + [initial_values[rv] for rv in to_sample_rvs]
    outputs = [sampler.sample_steps[rv] for rv in to_sample_rvs]
    sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
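
A note on the link function used above: assuming the SciPy-style parameterization in which ``nbinom(h, p)`` has mean ``h * (1 - p) / p`` (an assumption about the convention, not stated in this README), setting ``p = sigmoid(-eta)`` makes the conditional mean exactly ``h * exp(eta)`` — the usual log-linear form of negative-binomial regression. A quick standalone check of that identity:

```python
import math

def sigmoid(x):
    """Logistic function, 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# With p = sigmoid(-eta), the factor (1 - p) / p equals exp(eta),
# so the negative-binomial mean h * (1 - p) / p reduces to h * exp(eta).
eta, h = 0.3, 2.0
p = sigmoid(-eta)
mean = h * (1 - p) / p
assert abs(mean - h * math.exp(eta)) < 1e-12
```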

When no specialized sampler is found, AeMCMC assigns the NUTS sampler to the
remaining variables, automatically reparametrizing the model to improve
sampling when needed:

.. code-block:: python

    import aemcmc
    import aesara
    import aesara.tensor as at

    srng = at.random.RandomStream(0)

    mu_rv = srng.normal(0, 1, name="mu")
    sigma_rv = srng.halfnormal(0.0, 1.0, name="sigma")
    Y_rv = srng.normal(mu_rv, sigma_rv, name="Y")

    y_vv = Y_rv.clone()

    sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

    print(sampler.sample_steps.keys())
    # dict_keys([sigma, mu])
    print(sampler.stages)
    # {NUTSKernel: [sigma, mu]}
    print(sampler.parameters)
    # {NUTSKernel: (step_size, inverse_mass_matrix)}

    # Build a function that returns new samples
    step_size, inverse_mass_matrix = list(sampler.parameters.values())[0]
    inputs = [
        initial_values[mu_rv],
        initial_values[sigma_rv],
        y_vv,
        step_size,
        inverse_mass_matrix,
    ]
    outputs = [sampler.sample_steps[mu_rv], sampler.sample_steps[sigma_rv]]
    sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
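
The reparametrization mentioned above typically means transforming constrained variables (here ``sigma > 0``) to an unconstrained space before running NUTS. One standard transform — shown as a generic sketch, not necessarily the exact rewrite AeMCMC applies — maps an unconstrained ``s`` to ``sigma = exp(s)``, whose log-Jacobian correction is simply ``s``:

```python
import math

# Map an unconstrained s to sigma = exp(s) > 0; NUTS can then explore
# s over the whole real line. The target density picks up the
# log|Jacobian| term, which for this transform is just s,
# since d(sigma)/ds = exp(s) and log(exp(s)) = s.
def to_positive(s):
    return math.exp(s)

def log_abs_det_jacobian(s):
    return s

s = -0.5
sigma = to_positive(s)
assert sigma > 0.0
assert abs(log_abs_det_jacobian(s) - math.log(sigma)) < 1e-12
```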
Installation
============
The latest release of AeMCMC can be installed from PyPI using ``pip``:

::

    pip install aemcmc
Or via conda-forge:

::

    conda install -c conda-forge aemcmc
The current development branch of AeMCMC can be installed from GitHub, also using ``pip``:

::

    pip install git+https://github.com/aesara-devs/aemcmc

.. |Tests Status| image:: https://github.com/aesara-devs/aemcmc/workflows/Tests/badge.svg
   :target: https://github.com/aesara-devs/aemcmc/actions?query=workflow%3ATests
.. |Coverage| image:: https://codecov.io/gh/aesara-devs/aemcmc/branch/main/graph/badge.svg?token=45nKZ7fDG5
   :target: https://codecov.io/gh/aesara-devs/aemcmc
.. |Gitter| image:: https://badges.gitter.im/aesara-devs/aesara.svg
   :target: https://gitter.im/aesara-devs/aesara?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge