=========
UltraNest
=========
Fit and compare complex models reliably and rapidly with advanced sampling techniques.
.. image:: https://img.shields.io/pypi/v/ultranest.svg
:target: https://pypi.python.org/pypi/ultranest
.. image:: https://circleci.com/gh/JohannesBuchner/UltraNest/tree/master.svg?style=shield
:target: https://circleci.com/gh/JohannesBuchner/UltraNest
.. image:: https://img.shields.io/badge/docs-published-ok.svg
:target: https://johannesbuchner.github.io/UltraNest/
:alt: Documentation Status
.. image:: https://img.shields.io/badge/GitHub-JohannesBuchner%2FUltraNest-blue.svg?style=flat
:target: https://github.com/JohannesBuchner/UltraNest/
:alt: Github repository
.. image:: https://joss.theoj.org/papers/10.21105/joss.03001/status.svg
:target: https://doi.org/10.21105/joss.03001
:alt: Software paper
Correctness. Speed. Ease of use. 🦔
About
-----
When scientific models are compared to data, two tasks are important:
1) constraining the model parameters and 2) comparing the model to other models.
Different techniques have been developed to explore model parameter spaces.
This package implements a Monte Carlo technique called nested sampling.
**Nested sampling** allows Bayesian inference on arbitrary user-defined likelihoods.
In particular, posterior probability distributions on model parameters
are constructed, and the marginal likelihood ("evidence") Z is computed.
The former describe the parameter constraints implied by the data;
the latter can be used for model comparison (via `Bayes factors`)
as a measure of the predictive parsimony of a model.
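
To make this workflow concrete, here is a minimal, illustrative sketch
(the toy data, parameter names and priors are assumptions made for this
example, not taken from the documentation):

.. code-block:: python

    import numpy as np
    import ultranest

    # toy data: a few measurements whose mean and scatter we want to infer
    data = np.array([4.8, 5.1, 5.3, 4.9, 5.2])

    param_names = ["mean", "sigma"]

    def prior_transform(cube):
        # map the unit cube to the prior: mean ~ U(0, 10), sigma ~ log-U(0.01, 10)
        params = cube.copy()
        params[0] = cube[0] * 10
        params[1] = 10**(cube[1] * 3 - 2)
        return params

    def log_likelihood(params):
        mean, sigma = params
        return -0.5 * np.sum(((data - mean) / sigma)**2 + np.log(2 * np.pi * sigma**2))

    sampler = ultranest.ReactiveNestedSampler(param_names, log_likelihood, prior_transform)
    result = sampler.run()       # explores the posterior and integrates Z
    sampler.print_results()      # summary: ln(Z), its uncertainty, parameter posteriors

    posterior_samples = result["samples"]             # equally weighted posterior samples
    lnZ, lnZ_err = result["logz"], result["logzerr"]

The posterior samples serve parameter estimation; ln(Z) values from competing
models can be differenced to obtain Bayes factors.
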
In the last decade, multiple variants of nested sampling have been
developed. These differ in how nested sampling finds better and
better fits while respecting the priors
(constrained likelihood prior sampling techniques), and whether it is
allowed to go back to worse fits and explore the parameter space more.
This package provides novel, advanced techniques for both (see
`How it works <https://johannesbuchner.github.io/UltraNest/method.html>`_).
They are notable for being free of tuning parameters
and theoretically justified. Beyond that, UltraNest supports
large data sets and high-performance computing applications.
UltraNest is intended for fitting complex physical models with slow
likelihood evaluations, with one to hundreds of parameters.
UltraNest intends to replace heuristic methods like multi-ellipsoid
nested sampling and dynamic nested sampling with more rigorous methods.
UltraNest also aims to provide feature parity with other packages
(such as MultiNest).
You can help by testing UltraNest and reporting issues. Code contributions are welcome.
See the `Contributing page <https://johannesbuchner.github.io/UltraNest/contributing.html>`_.
Features
---------
* Pythonic
* pip and conda installable
* Easy to program for: Sanity checks with meaningful errors
* Can control the run programmatically and check status
* Reasonable defaults, but customizable
* Thoroughly tested with many unit and integration tests
* NEW: supports likelihood functions written in `Python <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/python>`_, `C <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/c>`_, `C++ <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/c%2B%2B>`_, `Fortran <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/fortran>`_, `Julia <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/julia>`_ and `R <https://github.com/JohannesBuchner/UltraNest/tree/master/languages/r>`_
* Robust exploration easily handles:
* Degenerate parameter spaces such as bananas or tight correlations
* Multiple modes/solutions in the parameter space
* Robust, parameter-free MLFriends algorithm
(metric learning RadFriends, Buchner+14,+19), with new improvements
(region follows new live points, clustering improves metric iteratively,
NEW in v4.0: refined local metric).
* High-dimensional problems with hit-and-run sampling
* Wrapped/circular parameters, derived parameters
* Fast-slow parameters
* Lightweight and fast
* some functions implemented in Cython
* `vectorized likelihood function calls <https://johannesbuchner.github.io/UltraNest/performance.html>`__,
  optimally supporting models with deep learning emulators (see the sketch after this list)
* Use multiple cores, fully parallelizable from laptops to computing clusters
* `MPI support <https://johannesbuchner.github.io/UltraNest/performance.html>`__
* Advanced visualisation and crash recovery:
* Live view of the exploration for Jupyter notebooks and terminals
* Publication-ready visualisations
* Corner plots, run and parameter exploration diagnostic plots
* Checkpointing and resuming, even with a different number of live points
* `Warm-start: resume from modified data / model <https://johannesbuchner.github.io/UltraNest/example-warmstart.html>`__
* Strategic nested sampling
* can vary (increase) the number of live points (akin to dynamic nested sampling, but with different targets)
* can sample clusters optimally (e.g., at least 50 points per cluster/mode/solution)
* can target minimizing parameter estimation uncertainties
* can target a desired evidence uncertainty threshold
* can target a desired number of effective samples
* or any combination of the above
* Robust ln(Z) uncertainties by bootstrapping live points.
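
As a sketch of how several of these features combine (the model, tuning values
and ``log_dir`` name below are illustrative assumptions, not recommendations
from this document): vectorized likelihood calls, a wrapped/circular parameter,
checkpointing to a log directory, and a slice step sampler for
higher-dimensional problems.

.. code-block:: python

    import numpy as np
    import ultranest
    import ultranest.stepsampler

    param_names = ["amplitude", "phase", "width"]

    def prior_transform(cube):
        # with vectorized=True, cube has shape (batch_size, n_params)
        params = cube.copy()
        params[:, 0] = cube[:, 0] * 10               # amplitude ~ U(0, 10)
        params[:, 1] = cube[:, 1] * 2 * np.pi        # phase ~ U(0, 2*pi), circular
        params[:, 2] = 10**(cube[:, 2] * 2 - 2)      # width ~ log-U(0.01, 1)
        return params

    def log_likelihood(params):
        # evaluates a whole batch of parameter vectors at once (toy example)
        amplitude, phase, width = params[:, 0], params[:, 1], params[:, 2]
        return -0.5 * (((amplitude - 3.0) / width)**2 + np.cos(phase))

    sampler = ultranest.ReactiveNestedSampler(
        param_names, log_likelihood, prior_transform,
        vectorized=True,                      # batched likelihood calls
        wrapped_params=[False, True, False],  # phase is circular
        log_dir="myfit",                      # checkpoint files are written here
        resume=True,                          # pick up a previous interrupted run
    )

    # optional: for higher-dimensional problems, switch from region sampling
    # to slice sampling
    sampler.stepsampler = ultranest.stepsampler.SliceSampler(
        nsteps=2 * len(param_names),
        generate_direction=ultranest.stepsampler.generate_mixture_random_direction,
    )

    result = sampler.run(min_num_live_points=400)
    sampler.print_results()

The same script parallelises across cores or cluster nodes when launched with
``mpiexec`` (with mpi4py installed), without code changes.
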
Usage
^^^^^
* `Get started! <https://johannesbuchner.github.io/UltraNest/using-ultranest.html>`_
* Read the full documentation with tutorials at:
* https://johannesbuchner.github.io/UltraNest/
* `API Reference <https://johannesbuchner.github.io/UltraNest/ultranest.html#ultranest.integrator.ReactiveNestedSampler>`_.
* `Code repository: https://github.com/JohannesBuchner/UltraNest/ <https://github.com/JohannesBuchner/UltraNest/>`_
Licence
^^^^^^^
How to `cite UltraNest <https://johannesbuchner.github.io/UltraNest/issues.html#how-should-i-cite-ultranest>`_.
GPLv3 (see LICENCE file). If you require another license, please contact me.
The cute hedgehog icon was made by `Freepik <https://www.flaticon.com/authors/freepik>`_.
It symbolises UltraNest's approach of carefully walking up a likelihood,
ready to defend against any encountered danger.
Contributors
^^^^^^^^^^^^
* Nicholas Susemiehl
* Quinn Gao
* Sigfried Vanaverbeke
* Warrick Ball
* Adipol Phosrisom
* Pieter Vuylsteke
* Alexander Harvey Nitz
* Gregory David Martinez
* Grigorii Smirnov-Pinchukov
* Fabio F Acero
* Jacopo Tissino
* Benjamin Beauchesne
* Kyle Barbary (some ellipsoid code adopted from https://github.com/kbarbary/nestle)
* Adam Moss (some architecture and parallelisation adopted from https://github.com/adammoss/nnest)
* Josh Speagle (some visualisations adopted from https://github.com/joshspeagle/dynesty/)
* Johannes Buchner
==============
Release Notes
==============
4.2.0 (2024-02-15)
------------------
* new ultranest.mlfriends.LocalAffineLayer for metric learning, set as default (see `issue 124 <https://github.com/JohannesBuchner/UltraNest/issues/124>`_)
* add Highest Density Interval function (ultranest.plot.highest_density_interval_from_samples)
* corner plot style with higher signal-to-ink ratio.
* bug fixes in popstepsampler
4.1.0 (2024-02-15)
------------------
* add number of steps calibrator ultranest.calibrator.ReactiveNestedCalibrator
* add relative jump distance diagnostic for step samplers
* make population step samplers more consistent with other step samplers
4.0.0 (2024-02-15)
------------------
* new ultranest.mlfriends.MaxPrincipleGapAffineLayer for metric learning, set as default
3.6.5 (2023-07-18)
------------------
* documentation improvements
* logging with MPI fixes `by adipol-ph <https://github.com/JohannesBuchner/UltraNest/issues/109>`_ and `by gregorydavidmartinez <https://github.com/JohannesBuchner/UltraNest/issues/110>`_
* more flexible plotting `by facero <https://github.com/JohannesBuchner/UltraNest/issues/108>`_
3.6.0 (2023-06-22)
------------------
* add PopulationRandomWalkSampler: vectorized Gaussian random walks for GPU/JAX-powered likelihoods
* limit initial widening to escape plateau (issue #81)
3.5.0 (2022-09-05)
------------------
* add hot-resume: resume from a similar fit (with different data)
* fix post_summary.csv column order
* fix build handling for non-pip systems (pyproject.toml)
* more efficient handling of categorical variables
3.4.0 (2022-04-05)
------------------
* add differential evolution proposal for slice sampling (now recommended)
* fix reverting of the step sampler when it steps outside the likelihood constraint, under MPI
* add SimpleRegion: axis-aligned ellipsoid for very high-d.
3.3.3 (2021-09-17)
------------------
* pretty marginal posterior plot to stdout
* avoid non-terminations when logzerr cannot be reached
* add RobustEllipsoidRegion: ellipsoidal without MLFriends for high-d.
* add WrappingEllipsoid: for additional rejection.
* bug fixes on rank order test
* add resume-similar
* modular step samplers
3.0.0 (2020-10-03)
------------------
* Accelerated Hit-and-Run Sampler added
* Support for other languages (C, C++, Julia, Fortran) added
* Insertion order test added
* Warm-start added
* Rejection sampling with transformed ellipsoid added
2.2.0 (2020-02-07)
------------------
* allow reading UltraNest outputs without ReactiveNestedSampler instance
2.1.0 (2020-02-07)
------------------
* adaptive number of steps for slice and hit-and-run samplers.
2.0.0 (2019-10-03)
------------------
* First release.
1.0.0 (2014)
------------------
* A simpler version referenced in Buchner et al. (2014),
combining RadFriends with an optional Metropolis-Hastings proposal.