# Monte Library

`monte-library` is a set of Monte Carlo methods in Python. The package is written to be flexible, easy to understand, and to encompass a variety of Monte Carlo methods.

* Free software: MIT license
* Documentation: <https://monte-library.readthedocs.io/en/latest/>

## Installation

The preferred method to install `monte-library` is through Python's package installer, pip. To install `monte-library`, run this command in your terminal:

```sh
pip install monte-library
```

Alternatively, you can install the package directly from GitHub:

```sh
git clone -b development https://github.com/draktr/monte-library.git
cd monte-library
pip install .
```

## Features

### Base module

* saving samples and log probability values as a `.csv` file
* posterior mean, standard deviation and quantiles
* diagnostic tools: effective sample size (ESS), autocorrelation plots, ergodic mean plots, acceptance rate, k-fold histograms, the Gelman-Rubin statistic (R-hat; see the sketch below) and its multivariate extension, the Geweke diagnostic, the Heidelberger-Welch diagnostic, the Raftery-Lewis diagnostic, and Markov chain stationarity tests
* visualizations: histograms, kernel density estimation plots, traceplots
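
As an illustration of one of these diagnostics, here is a minimal NumPy sketch of the (univariate) Gelman-Rubin statistic. This is the generic textbook computation, not `monte-library`'s own implementation:

```python
import numpy as np

def gelman_rubin(chains):
    """Gelman-Rubin potential scale reduction factor (R-hat).

    Generic textbook computation for an array of shape (m, n): m chains
    with n draws each. Not monte-library's own implementation.
    """
    chains = np.asarray(chains, dtype=float)
    n = chains.shape[1]
    between = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance B
    within = chains.var(axis=1, ddof=1).mean()      # mean within-chain variance W
    var_hat = (n - 1) / n * within + between / n    # pooled variance estimate
    return np.sqrt(var_hat / within)

# Two well-mixed chains from the same distribution: R-hat should be near 1
rng = np.random.default_rng(0)
print(gelman_rubin(rng.normal(size=(2, 1000))))
```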

### General Monte Carlo Methods

* multidimensional Monte Carlo integration
* multidimensional rejection sampling (illustrated below)
* multidimensional importance sampling
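
To make these concrete, below is a minimal, self-contained rejection-sampling sketch in plain NumPy. It is a generic illustration of the method, not the package's own implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_sample(target_pdf, proposal_rvs, proposal_pdf, m, size):
    """Generic rejection sampler: draw x from the proposal and accept it
    with probability target_pdf(x) / (m * proposal_pdf(x)), where m bounds
    the ratio target_pdf / proposal_pdf. Illustration only."""
    samples = []
    while len(samples) < size:
        x = proposal_rvs()
        if rng.uniform() < target_pdf(x) / (m * proposal_pdf(x)):
            samples.append(x)
    return np.array(samples)

# Example: triangular density 2x on [0, 1] via a uniform proposal (bound m = 2)
draws = rejection_sample(lambda x: 2 * x, rng.uniform, lambda x: 1.0, 2.0, 5000)
print(draws.mean())  # close to the true mean of 2/3
```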

### Markov Chain Monte Carlo Modelling Methods

* symmetric proposal Metropolis algorithm (see the acceptance-step sketch below)
* Metropolis-Hastings algorithm
* Gibbs sampler
* vanilla Hamiltonian Monte Carlo
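
The core of the symmetric-proposal Metropolis algorithm, for instance, is the accept/reject update sketched below (a generic illustration, not `monte-library`'s code; Example 2 shows the package's own `GaussianMetropolis` in action):

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_step(theta, log_posterior, step_size=1.0):
    """One Gaussian random-walk (symmetric-proposal) Metropolis update.

    Generic illustration of the accept/reject rule, not monte-library's code.
    """
    proposal = theta + step_size * rng.normal(size=theta.shape)
    # Accept with probability min(1, p(proposal) / p(theta)), computed in log space
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        return proposal
    return theta

# A few updates targeting a standard normal log density
theta = np.zeros(2)
for _ in range(5):
    theta = metropolis_step(theta, lambda t: -0.5 * np.sum(t ** 2))
print(theta)
```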

## Advantages

* **FLEXIBLE** - the package lets users apply various existing Monte Carlo methods to their problems without having to write the whole algorithm, while still allowing them to specify their own hyperparameters, posterior, and proposal distributions as needed. Furthermore, the `BaseSampler` class can be used as a parent class for any proprietary Monte Carlo algorithm, which then inherits its features for visualizations, posterior analysis, and convergence checks.
* **SIMPLE AND CLEAR CODE BASE** - the code is intentionally kept simple so that it is understandable to those with limited exposure to statistical computing. `monte-library` is a great tool to supplement learning, as the code generally matches the mathematical formulations of the algorithms and the simple syntax helps keep the focus on the algorithm itself.
* **COMPREHENSIVE** - includes Monte Carlo methods for various applications. Bayesian modelling methods include both classical methods (e.g. the Metropolis algorithm) and more advanced methods such as Hamiltonian Monte Carlo.

## Usage

The package contains a variety of Monte Carlo methods that can be applied to problems ranging from integration to modelling. Importantly, the code is both simple and generalized so as to match the respective mathematical formulations of the algorithms, which makes it a great supplement when learning these topics. Finally, the package is flexible: the `BaseSampler` class can be used as a parent class for any user-defined sampler, and it is easy to modify the existing algorithms with proprietary improvements.

### Example 1: Monte Carlo Integration

The following example uses simple Monte Carlo integration to solve the integral
$$\int_{-3}^{3} \int_{-3}^{3} \left( x^2 + y^3 \right) \, dx \, dy$$

```python
from monte import integrator

# The integrand receives the coordinates as a single array-like argument
def integrand(args):
    return args[0] ** 2 + args[1] ** 3

result = integrator(integrand, lower_bounds=[-3, -3], upper_bounds=[3, 3], n=10000000)
print(result)
```
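
Since the $y^3$ term vanishes by symmetry, the analytic value of the integral is $6 \int_{-3}^{3} x^2 \, dx = 108$, which makes a handy sanity check. Conceptually, naive Monte Carlo integration averages the integrand over uniform draws and scales by the volume of the region; a generic sketch of this idea (an illustration, not the library's internals) looks like this:

```python
import numpy as np

# Sketch of naive Monte Carlo integration (illustration, not the library's code):
# average the integrand over uniform draws and scale by the region's volume.
rng = np.random.default_rng(0)
lower = np.array([-3.0, -3.0])
upper = np.array([3.0, 3.0])
n = 1_000_000

points = rng.uniform(lower, upper, size=(n, 2))   # uniform draws in the box
values = points[:, 0] ** 2 + points[:, 1] ** 3
estimate = np.prod(upper - lower) * values.mean()
print(estimate)  # close to the analytic value of 108
```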

### Example 2: Bayesian Linear Regression with Metropolis Algorithm

Example 2 uses the Metropolis algorithm (with a Gaussian proposal) to estimate the parameters of a multivariate linear regression.

```python
import numpy as np
from scipy import stats
from monte import GaussianMetropolis

# First, we generate some data
true_theta = np.array([5, 10, 2, 2, 4])
n = 1000
x = np.zeros((n, 4))
x[:, 0] = np.repeat(1, n)
x[:, 1:4] = stats.norm(loc=0, scale=1).rvs(size=(n, 3))

mu = np.matmul(x, true_theta[0:-1])
y = stats.norm(loc=mu, scale=true_theta[-1]).rvs(size=n)

# Define the log posterior (log priors and log likelihood are summed)
def posterior(theta, x, y):
    beta_prior = stats.multivariate_normal(
        mean=np.repeat(0, len(theta[0:-1])),
        cov=np.diag(np.repeat(30, len(theta[0:-1]))),
    ).logpdf(theta[0:-1])
    sd_prior = stats.uniform(loc=0, scale=30).logpdf(theta[-1])

    mu = np.matmul(x, theta[0:-1])
    likelihood = np.sum(stats.norm(loc=mu, scale=theta[-1]).logpdf(y))

    return beta_prior + sd_prior + likelihood

# Lastly, we sample
gaussian_sampler = GaussianMetropolis(posterior)
gaussian_sampler.sample(
    iter=10000,
    warmup=5000,
    theta=np.array([0, 0, 0, 0, 1]),
    step_size=1,
    lag=1,
    x=x,
    y=y,
    )
```

Using methods from the `BaseSampler` class, we can perform posterior analytics. These are some of the available methods:

```python
# Checking parameter estimates and their credible intervals
gaussian_sampler.mean()
gaussian_sampler.credible_interval()

# Checking Metropolis acceptance rate
gaussian_sampler.acceptance_rate()

# Plotting KDE plot with histogram
gaussian_sampler.parameter_kde()

# Plotting traceplots and ergodic means, and calculating effective sample sizes as convergence diagnostics
gaussian_sampler.traceplots()
gaussian_sampler.plot_ergodic_mean()

```

### Example 3: Sampling from a Multivariate Distribution using Hamiltonian Monte Carlo

In the following example, we use the Hamiltonian Monte Carlo (HMC) algorithm to sample from a distribution. Note that this is a toy example; HMC is more appropriate for higher-dimensional model parameter estimation. Also note that an analytical gradient is not necessary, as the sketch below illustrates.
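
Since an analytical gradient is not required, a numerical approximation can be supplied in its place. A generic central-difference construction (a hypothetical helper for illustration, not part of the package's API) might look like this:

```python
import numpy as np

def numerical_gradient(f, eps=1e-6):
    """Wrap a scalar function f of a vector into a central-difference gradient.

    Hypothetical generic helper, shown only to illustrate that an analytical
    gradient can be replaced by a numerical approximation.
    """
    def grad(theta):
        theta = np.asarray(theta, dtype=float)
        g = np.empty_like(theta)
        for i in range(theta.size):
            step = np.zeros_like(theta)
            step[i] = eps
            g[i] = (f(theta + step) - f(theta - step)) / (2 * eps)
        return g
    return grad
```

The example itself nevertheless uses the analytical gradient, which is trivial to write down here: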

```python
import numpy as np
from monte import HamiltonianMC

# Defining the distribution that we are going to sample from...
def posterior(theta):
    return -0.5 * np.sum(theta**2)

# ... and its gradient
def posterior_gradient(theta):
    return -theta

# Sampling
sampler = HamiltonianMC(posterior, posterior_gradient)
sampler.sample(
    iter=10000,
    warmup=10,
    theta=np.array([8.0, -3.0]),
    epsilon=0.01,
    l=10,
    metric=None,
    lag=1,
    )

```
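
For intuition about what happens inside the sampler: HMC proposes new states by simulating Hamiltonian dynamics with a leapfrog integrator, where `epsilon` is the step size and `l` the number of steps. A minimal generic sketch of one trajectory (not the package's internals) is:

```python
import numpy as np

def leapfrog(theta, momentum, grad_log_p, epsilon, l):
    """Simulate l leapfrog steps of size epsilon for Hamiltonian dynamics.

    Generic sketch of the integrator at the heart of HMC; grad_log_p is the
    gradient of the log posterior (posterior_gradient above).
    """
    theta, momentum = theta.astype(float).copy(), momentum.astype(float).copy()
    momentum += 0.5 * epsilon * grad_log_p(theta)  # initial half step for momentum
    for _ in range(l - 1):
        theta += epsilon * momentum                # full step for position
        momentum += epsilon * grad_log_p(theta)    # full step for momentum
    theta += epsilon * momentum
    momentum += 0.5 * epsilon * grad_log_p(theta)  # final half step for momentum
    return theta, momentum

# One trajectory from the starting point used above
new_theta, new_momentum = leapfrog(
    np.array([8.0, -3.0]), np.array([1.0, 1.0]), posterior_gradient, 0.01, 10
)
```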

## Alternatives and Complements

For off-the-shelf solutions, there are more sophisticated and computationally efficient implementations of Monte Carlo methods:

* [ArviZ](https://www.arviz.org/en/latest/) - independent library for exploratory analysis of Bayesian models
* [vegas](https://github.com/gplepage/vegas) - uses an improved version of the adaptive Monte Carlo vegas algorithm
* [OpenBUGS](https://www.mrc-bsu.cam.ac.uk/software/bugs/openbugs/) - open-source implementation of the BUGS language utilizing the Gibbs sampler
* [JAGS](https://mcmc-jags.sourceforge.io/) - cross-platform and more extensible implementation of the BUGS language
* [WinBUGS](https://www.mrc-bsu.cam.ac.uk/software/bugs/the-bugs-project-winbugs/) - software for Bayesian analysis utilizing the Gibbs sampler (available, but discontinued in favour of OpenBUGS)
* [Stan](https://mc-stan.org/) - state-of-the-art probabilistic programming language implementing an advanced version of the No-U-Turn Sampler
* [PyMC](https://github.com/pymc-devs/pymc) - supports HMC and Metropolis-Hastings algorithms, as well as Sequential Monte Carlo methods

## Project Principles

* Easy to understand and use, even for non-mathematicians
* Potential to be used as a pedagogical tool
* Easy to modify algorithms with proprietary improvements
* Flexibility and simplicity over computational efficiency
* Tested
* Dedicated documentation
* Formatting deferred to [Black](https://github.com/psf/black)

## Future Development

Feel free to reach out through the Issues forum if you wish to add features or help in any other way. If there are any issues, bugs, or improvement recommendations, please raise them in the forum. In particular, reach out if you want to contribute any of the possible features listed below.

### Possible Future Features

#### In `BaseSampler` Class

* Sampling trace animation
* ECDF plot
* Forest plot of parameter estimates with credible intervals
* Monte Carlo Error
* Support for multiple chains

#### Monte Carlo Methods

* Slice sampling
* Annealed importance sampling
* Component-wise Metropolis
* Independent Metropolis
* Wang-Landau algorithm
* Monte Carlo tree search
* Direct Sampling Monte Carlo
* Monte Carlo statistical distribution test

## Further Reading

The following is a non-exhaustive list of useful sources for learning more about Monte Carlo methods. Some of the code in `monte` has been written based on mathematical formulae from these sources.

### General

[1] Ntzoufras, I. (2009). Bayesian Modelling Using WinBUGS. Wiley.   
[2] Metropolis, N., & Ulam, S. (1949). The Monte Carlo Method. Journal of the American Statistical Association, 44(247), 335–341. <https://doi.org/10.1080/01621459.1949.10483310>

### Metropolis

[3] Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of State Calculations by Fast Computing Machines. The Journal of Chemical Physics, 21(6), 1087. <https://doi.org/10.1063/1.1699114>   
[4] Hastings, W. K. (1970). Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57(1), 97–109. <https://doi.org/10.1093/BIOMET/57.1.97>   
[5] Hartig, F. (n.d.). A simple Metropolis-Hastings MCMC in R | theoretical ecology. Retrieved February 15, 2023, from <https://theoreticalecology.wordpress.com/2010/09/17/metropolis-hastings-mcmc-in-r/>   
[6] Dirty Quant @YouTube. (n.d.). The Metropolis-Hastings Algorithm (MCMC in Python) - YouTube. Retrieved February 15, 2023, from <https://www.youtube.com/watch?v=MxI78mpq_44>   
[7] TWEAG Software Innovation Lab. (n.d.). Markov chain Monte Carlo (MCMC) Sampling, Part 1: The Basics - Tweag. Retrieved February 15, 2023, from <https://www.tweag.io/blog/2019-10-25-mcmc-intro1/>   
[8] Urbanevych, V. (n.d.). VU | Bayesian linear regression and Metropolis-Hastings sampler. Retrieved February 15, 2023, from <https://vitaliiur.github.io/blog/2021/linreg/>

### Gibbs Sampler

[9] Geman, S., & Geman, D. (1984). Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-6(6), 721–741. <https://doi.org/10.1109/TPAMI.1984.4767596>   
[10] Campbell, K. R. (n.d.). Gibbs sampling for Bayesian linear regression in Python | Kieran R Campbell - blog. Retrieved February 15, 2023, from <https://kieranrcampbell.github.io/blog/2016/05/15/gibbs-sampling-bayesian-linear-regression.html>

### Hamiltonian Monte Carlo

[11] Betancourt, M. (2017). A Conceptual Introduction to Hamiltonian Monte Carlo. <https://doi.org/10.48550/arxiv.1701.02434>   
[12] Neal, R. M. (2012). MCMC using Hamiltonian dynamics. Handbook of Markov Chain Monte Carlo, 1–592. <https://doi.org/10.1201/b10905>   
[13] Stan. (n.d.). 15.1 Hamiltonian Monte Carlo | Stan Reference Manual. Retrieved February 15, 2023, from <https://mc-stan.org/docs/reference-manual/hamiltonian-monte-carlo.html>   
[14] Clark, M. (n.d.). Hamiltonian Monte Carlo | Model Estimation by Example. Retrieved February 15, 2023, from <https://m-clark.github.io/models-by-example/hamiltonian-monte-carlo.html>   
[15] Richard. (n.d.). Markov Chains: Why Walk When You Can Flow? | Elements of Evolutionary Anthropology. Retrieved February 15, 2023, from <http://elevanth.org/blog/2017/11/28/build-a-better-markov-chain/>

            
