lmc 0.2.2 (PyPI)

Summary: Logarithmantic Monte Carlo
Home page: https://github.com/abmantz/lmc
Author: Adam Mantz
License: LGPL-3.0
Uploaded: 2024-08-12 23:14:25

.. image:: https://img.shields.io/badge/ascl-1706.005-blue.svg?colorB=262255
   :alt: ascl:1706.005
   :target: http://ascl.net/1706.005
.. image:: https://img.shields.io/pypi/v/lmc.svg
   :alt: PyPi
   :target: https://pypi.python.org/pypi/lmc
.. image:: https://img.shields.io/pypi/l/lmc.svg
   :alt: LGPL-3.0
   :target: https://www.gnu.org/licenses/lgpl-3.0.txt

=====================================================================================
Logarithmantic Monte Carlo (LMC)
=====================================================================================

----------------------------------------
Python code for Markov Chain Monte Carlo
----------------------------------------

`Logarithmancy <https://en.wiktionary.org/wiki/logarithmancy>`_ (n): divination by means of algorithms

What is this?
=============

``LMC`` (not to be confused with the Large Magellanic Cloud) is a bundle of Python code for performing Markov Chain Monte Carlo, which implements a few different multidimensional proposal strategies and (optionally parallel) adaptation methods. There are similar packages out there, notably `pymc <https://github.com/pymc-devs/pymc>`_ - ``LMC`` exists because I found the alternatives to be too inflexible for the work I was doing at the time. On the off chance that someone else is in the same boat, here it is.

The samplers currently included are Metropolis, slice, and the affine-invariant sampler popularized by `emcee <http://dan.iel.fm/emcee>`_ (`Goodman & Weare 2010 <http://dx.doi.org/10.2140/camcos.2010.5.65>`_).

An abridged description of the package (from the ``help()`` function) is copied here::

 The module should be very flexible, but is designed with these things foremost in mind:
  1. use with expensive likelihood calculations which probably have a host of hard-to-modify
     code associated with them.
  2. making it straightforward to break the parameter space into subspaces which can be sampled
     using different proposal methods and at different rates. For example, if changing some
     parameters requires very expensive calculations in the likelihood, the other, faster
     parameters can be sampled at a higher rate. Or, some parameters may lend themselves to
     Gibbs sampling, while others may not, and these can be block updated independently.
  3. keeping the overhead low to facilitate large numbers of parameters. Some of this has been
     lost in the port from C++, but, for example, the package provides automatic tuning of the
     proposal covariance for block updating without needing to store traces of the parameters in
     memory.

 Real-valued parameters are usually assumed, but the framework can be used with other types of 
 parameters, with suitable overloading of classes.

 A byproduct of item (1) is that the user is expected to handle all aspects of the calculation of 
 the posterior. The module doesn't implement assignment of canned, standard priors, or automatic 
 discovery of shortcuts like conjugate Gibbs sampling. The idea is that the user is in the best 
 position to know how the details of the likelihood and priors should be implemented.

 Communication between parallel chains can significantly speed up convergence. In parallel mode, 
 adaptive Updaters use information from all running chains to tune their proposals, rather than 
 only from their own chain. The Gelman-Rubin convergence criterion (ratio of inter- to intra-chain 
 variances) for each free parameter is also calculated. Parallelization is implemented in two ways; 
 see ?Updater for instructions on using each.
  1. Via MPI (using mpi4py). MPI adaptations are synchronous: when a chain reaches a communication
     point, it stops until all chains have caught up.
  2. Via the filesystem. When a chain adapts, it will write its covariance information to a file. It
     will then read in any information from other chains that is present in similar files, and
     incorporate it when tuning. This process is asynchronous; chains will not wait for one another; 
     they will simply adapt using whatever information has been shared at the time. 
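
For orientation, here is a minimal sketch of what a serial run might look like, loosely modeled on the package's built-in ``example()`` function. The class names (``Parameter``, ``ParameterSpace``, ``Metropolis``, ``MultiDimSequentialUpdater``, ``Engine``, ``textBackend``), keyword names, and tuning arguments below are assumptions and should be checked against the docstrings in ``lmc.py`` and the examples linked under Usage and Help::

 import lmc

 # Two free parameters; the keyword names (value, width) are assumptions --
 # see the Parameter docstring.
 x = lmc.Parameter(name='x', value=0.0, width=1.0)
 y = lmc.Parameter(name='y', value=0.0, width=1.0)

 # The log-posterior is entirely the user's responsibility (see item 1 above).
 # Parameter values are read by calling them; here a toy bivariate Gaussian.
 def log_posterior(struct):
     return -0.5 * ((struct.x() - 1.0)**2 + (struct.y() + 1.0)**2)

 # Any object whose attributes hold the Parameters can serve as the thing
 # passed to the log-posterior.
 class Struct(object):
     pass
 struct = Struct()
 struct.x = x
 struct.y = y

 # Associate the parameters with the log-posterior function.
 space = lmc.ParameterSpace([x, y], log_posterior)

 # A Metropolis step with an adaptive, block-updating Updater; the two
 # numbers (adaptation interval and start) are placeholder guesses, and the
 # optional parallel argument (see ?Updater) selects MPI or filesystem
 # communication between chains, as described above.
 updater = lmc.MultiDimSequentialUpdater(space, lmc.Metropolis(), 100, 100)
 engine = lmc.Engine([updater], [x, y])

 # Run 10000 iterations, writing the chain of (x, y) values to a text file.
 with open('chain.txt', 'w') as f:
     engine(10000, struct, [lmc.textBackend(f)])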


Installation
============

Automatic
---------

Install from PyPI by running ``pip install lmc``.

Manual
------

Download ``lmc/lmc.py`` and put it somewhere on your ``PYTHONPATH``. You will need to have the ``numpy`` package installed. The ``mpi4py`` package is optional, but highly recommended.
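
As an alternative to setting ``PYTHONPATH``, a session can extend ``sys.path`` before importing; a minimal sketch (the directory below is hypothetical)::

 import sys
 sys.path.append('/path/to/directory/containing/lmc')  # hypothetical location of lmc.py
 import lmc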

Usage and Help
==============

Documentation can be found throughout ``lmc.py``, mostly in the form of docstrings, so it's also available through the Python interpreter. There's also a ``help()`` function (near the top of the file, if you're browsing) and an ``example()`` function (near the bottom).
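
For example, from an interactive session (a sketch; in particular, check the ``example()`` docstring, since it may take an argument selecting which demonstration to run)::

 import lmc
 help(lmc)        # module and class docstrings via Python's built-in help
 lmc.help()       # the package's own overview, abridged above
 # lmc.example()  # demonstration; see its docstring before calling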

The examples can also be browsed `here <https://github.com/abmantz/lmc/tree/master/examples>`_.

            
