aemcmc-nightly
==============

:Name: aemcmc-nightly
:Version: 0.0.6
:Summary: Miscellaneous MCMC samplers written in Aesara
:Home page: http://github.com/aesara-devs/aemcmc
:Maintainer: Brandon T. Willard
:Requires Python: >=3.7
:Keywords: math, probability, numerical, symbolic, MCMC
:Upload time: 2022-11-27 19:13:11
            |Tests Status| |Coverage| |Gitter|

AeMCMC is a Python library that automates the construction of samplers for `Aesara <https://github.com/pymc-devs/aesara>`_ graphs that represent statistical models.

Features
========

This project is currently in an alpha state, but its basic features/objectives are as follows:

- Provide utilities that simplify the process of constructing Aesara graphs/functions for posterior and posterior predictive sampling
- Host a wide array of "exact" posterior sampling steps (e.g. Gibbs steps, scale-mixture/decomposition-based conditional samplers, etc.)
- Build a framework for identifying and composing said sampler steps and enumerating the possible samplers for an arbitrary model

Overall, we would like this project to serve as a hub for community-sourced specialized samplers and facilitate their general use.

Getting started
===============

Using AeMCMC, one can construct sampling steps from a graph containing Aesara
`RandomVariable`\s. AeMCMC analyzes the model graph and possibly rewrites it
to find the most suitable sampler.

AeMCMC can recognize closed-form posteriors; for instance, sampling from the
posterior of the following Beta-Binomial model amounts to sampling from a Beta
distribution:

.. code-block:: python

    import aesara
    import aemcmc
    import aesara.tensor as at

    srng = at.random.RandomStream(0)

    p_rv = srng.beta(1., 1., name="p")
    Y_rv = srng.binomial(10, p_rv, name="Y")

    y_vv = Y_rv.clone()
    y_vv.name = "y"

    sample_steps, _, initial_values, _ = aemcmc.construct_sampler(
        {Y_rv: y_vv}, srng
    )

    p_posterior_step = sample_steps[p_rv]
    aesara.dprint(p_posterior_step)
    # beta_rv{0, (0, 0), floatX, False}.1 [id A]
    #  |RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F77B2831200>) [id B]
    #  |TensorConstant{[]} [id C]
    #  |TensorConstant{11} [id D]
    #  |Elemwise{add,no_inplace} [id E]
    #  | |TensorConstant{1.0} [id F]
    #  | |y [id G]
    #  |Elemwise{sub,no_inplace} [id H]
    #    |Elemwise{add,no_inplace} [id I]
    #    | |TensorConstant{1.0} [id F]
    #    | |TensorConstant{10} [id J]
    #    |y [id G]

    sample_fn = aesara.function([y_vv], p_posterior_step)
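The printed graph encodes the conjugate Beta-Binomial update: with a Beta(α, β) prior and ``y`` successes out of ``n`` trials, the posterior is Beta(α + y, β + n − y). As a minimal, library-free sketch of that arithmetic (the helper name here is ours, not part of AeMCMC):

```python
def beta_binomial_posterior(alpha, beta, y, n):
    """Conjugate update: Beta(alpha, beta) prior + Binomial(n, p) likelihood."""
    return alpha + y, beta + n - y

# Matches the graph printed above: alpha = beta = 1, n = 10,
# so the posterior parameters are (1 + y, 1 + 10 - y).
print(beta_binomial_posterior(1.0, 1.0, 7, 10))  # (8.0, 4.0)
```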

AeMCMC also contains a database of Gibbs samplers that can be used to sample
some models more efficiently than a general-purpose sampler like NUTS
would:

.. code-block:: python

    import aemcmc
    import aesara.tensor as at

    srng = at.random.RandomStream(0)

    X = at.matrix("X")

    # Horseshoe prior for `beta_rv`
    tau_rv = srng.halfcauchy(0, 1, name="tau")
    lmbda_rv = srng.halfcauchy(0, 1, size=X.shape[1], name="lambda")
    beta_rv = srng.normal(0, lmbda_rv * tau_rv, size=X.shape[1], name="beta")

    a = at.scalar("a")
    b = at.scalar("b")
    h_rv = srng.gamma(a, b, name="h")

    # Negative-binomial regression
    eta = X @ beta_rv
    p = at.sigmoid(-eta)
    Y_rv = srng.nbinom(h_rv, p, name="Y")

    y_vv = Y_rv.clone()
    y_vv.name = "y"

    sample_steps, updates, initial_values, parameters = aemcmc.construct_sampler(
        {Y_rv: y_vv}, srng
    )
    print(sample_steps.keys())
    # dict_keys([tau, lambda, beta, h])
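The horseshoe prior used above is a scale mixture: each coefficient is drawn as beta_j ~ Normal(0, lambda_j * tau), with half-Cauchy(0, 1) local scales lambda_j and a half-Cauchy(0, 1) global scale tau. A stdlib-only sketch of drawing from this prior (helper names are ours; a half-Cauchy draw is obtained by inverse-CDF sampling of a Cauchy and taking the absolute value):

```python
import math
import random

def half_cauchy(rng):
    # |C| where C ~ Cauchy(0, 1), via the inverse CDF tan(pi * (u - 1/2))
    return abs(math.tan(math.pi * (rng.random() - 0.5)))

def horseshoe_prior_draw(n_features, rng):
    """Draw beta_j ~ Normal(0, lambda_j * tau) with half-Cauchy scales."""
    tau = half_cauchy(rng)                                  # global shrinkage
    lmbda = [half_cauchy(rng) for _ in range(n_features)]   # local shrinkage
    return [rng.gauss(0.0, l * tau) for l in lmbda]

rng = random.Random(0)
beta = horseshoe_prior_draw(5, rng)
print(len(beta))  # 5
```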


When no specialized sampler is found, AeMCMC assigns the NUTS sampler to the
remaining variables, automatically reparametrizing the model to improve
sampling where needed:

.. code-block:: python

    import aemcmc
    import aesara.tensor as at

    srng = at.random.RandomStream(0)
    mu_rv = srng.normal(0, 1, name="mu")
    sigma_rv = srng.halfnormal(0.0, 1.0, name="sigma")
    Y_rv = srng.normal(mu_rv, sigma_rv, name="Y")

    y_vv = Y_rv.clone()

    sample_steps, updates, initial_values, parameters = aemcmc.construct_sampler(
        {Y_rv: y_vv}, srng
    )
    print(sample_steps.keys())
    # dict_keys([sigma, mu])
    print(parameters.keys())
    # dict_keys(['step_size', 'inverse_mass_matrix'])
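Once the sampling steps are compiled (e.g. with ``aesara.function``), drawing a chain reduces to a loop that feeds each draw back in as the current state. A schematic, library-free sketch of that driver loop, with a stubbed step function standing in for a compiled step (all names here are illustrative, not AeMCMC API):

```python
def stub_step(state, data):
    # Stand-in for a compiled sampling step: moves the state halfway
    # toward the data mean on each call.
    mean = sum(data) / len(data)
    return state + 0.5 * (mean - state)

def run_chain(step, init, data, n_draws):
    """Iterate a sampling step, collecting one draw per iteration."""
    draws, state = [], init
    for _ in range(n_draws):
        state = step(state, data)
        draws.append(state)
    return draws

draws = run_chain(stub_step, 0.0, [1.0, 2.0, 3.0], n_draws=10)
print(len(draws))  # 10
```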


Installation
============

The latest release of AeMCMC can be installed from PyPI using ``pip``:

::

    pip install aemcmc


Or via conda-forge:

::

    conda install -c conda-forge aemcmc


The current development branch of AeMCMC can be installed from GitHub, also using ``pip``:

::

    pip install git+https://github.com/aesara-devs/aemcmc



.. |Tests Status| image:: https://github.com/aesara-devs/aemcmc/workflows/Tests/badge.svg
  :target: https://github.com/aesara-devs/aemcmc/actions?query=workflow%3ATests
.. |Coverage| image:: https://codecov.io/gh/aesara-devs/aemcmc/branch/main/graph/badge.svg?token=45nKZ7fDG5
  :target: https://codecov.io/gh/aesara-devs/aemcmc
.. |Gitter| image:: https://badges.gitter.im/aesara-devs/aesara.svg
  :target: https://gitter.im/aesara-devs/aesara?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge
