[![PyPI](https://img.shields.io/pypi/v/optimas)](https://pypi.org/project/optimas/)
[![Conda Version](https://img.shields.io/conda/vn/conda-forge/optimas.svg)](https://anaconda.org/conda-forge/optimas)
[![tests badge](https://github.com/optimas-org/optimas/actions/workflows/unix.yml/badge.svg)](https://github.com/optimas-org/optimas/actions)
[![Documentation Status](https://readthedocs.org/projects/optimas/badge/?version=latest)](https://optimas.readthedocs.io/en/latest/?badge=latest)
[![DOI](https://zenodo.org/badge/287560975.svg)](https://zenodo.org/badge/latestdoi/287560975)
[![License](https://img.shields.io/pypi/l/optimas.svg)](license.txt)
<!-- PROJECT LOGO -->
<br />
<div align="center">
<a href="https://github.com/optimas-org/optimas">
<img src="https://user-images.githubusercontent.com/20479420/219680583-34ac9525-7715-4e2a-b4fe-74848e9f59b2.png" alt="optimas logo" width="350">
</a>
<h3 align="center">
Optimization at scale, powered by
<a href="https://libensemble.readthedocs.io/"><strong>libEnsemble</strong></a>
</h3>
<p align="center">
<a href="https://optimas.readthedocs.io/"><strong>Explore the docs »</strong></a>
<br />
<br />
<a href="https://optimas.readthedocs.io/en/latest/examples/index.html">View Examples</a>
·
<a href="https://optimas-group.slack.com/">Support</a>
·
<a href="https://optimas.readthedocs.io/en/latest/api/index.html">API Reference</a>
</p>
</div>
Optimas is a Python library designed for highly scalable optimization, from laptops to massively parallel supercomputers.
## Key Features
- **Scalability**: Leveraging the power of [libEnsemble](https://github.com/Libensemble/libensemble), Optimas is designed to scale seamlessly from your laptop to high-performance computing clusters.
- **User-Friendly**: Optimas simplifies the process of running large parallel parameter scans and optimizations. Specify the number of parallel evaluations and the computing resources to allocate to each of them, and Optimas handles the rest.
- **Advanced Optimization**: Optimas integrates algorithms from the [Ax](https://github.com/facebook/Ax) library, offering both single- and multi-objective Bayesian optimization. This includes advanced techniques such as multi-fidelity and multi-task algorithms.
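As a sketch of what this looks like in practice, the snippet below wires a toy objective into an exploration. Class and argument names follow the Optimas documentation for a recent release, so treat them as assumptions that may shift between versions:

```python
def eval_func(input_params, output_params):
    """Toy objective: a parabola whose minimum sits at x = 2."""
    output_params["f"] = (input_params["x"] - 2.0) ** 2


def run_exploration():
    # Imports are kept inside the function so the objective above can be
    # read and tested without Optimas installed.
    from optimas.core import Objective, VaryingParameter
    from optimas.evaluators import FunctionEvaluator
    from optimas.explorations import Exploration
    from optimas.generators import AxSingleFidelityGenerator

    # Ax-backed single-objective Bayesian optimization over x in [-5, 5].
    generator = AxSingleFidelityGenerator(
        varying_parameters=[VaryingParameter("x", -5.0, 5.0)],
        objectives=[Objective("f", minimize=True)],
    )
    exploration = Exploration(
        generator=generator,
        evaluator=FunctionEvaluator(function=eval_func),
        max_evals=20,
        sim_workers=2,  # two evaluations run concurrently
    )
    exploration.run()
```

Calling `run_exploration()` would launch 20 evaluations, two at a time; the same script scales to a cluster by increasing `sim_workers`.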
## Installation
You can install Optimas from PyPI (recommended):
```sh
python -m pip install "optimas[all]"
```
from conda-forge:
```sh
conda install optimas --channel conda-forge
```
or directly from GitHub:
```sh
python -m pip install "optimas[all] @ git+https://github.com/optimas-org/optimas.git"
```
Make sure `mpi4py` is available in your environment before installing Optimas. For more details, check out the full [installation guide](https://optimas.readthedocs.io/en/latest/user_guide/installation_local.html). We have also prepared dedicated installation instructions for some HPC systems such as
[JUWELS (JSC)](https://optimas.readthedocs.io/en/latest/user_guide/installation_juwels.html),
[Maxwell (DESY)](https://optimas.readthedocs.io/en/latest/user_guide/installation_maxwell.html) and
[Perlmutter (NERSC)](https://optimas.readthedocs.io/en/latest/user_guide/installation_perlmutter.html).
## Documentation
For more information on how to use Optimas, check out the [documentation](https://optimas.readthedocs.io/). You'll find installation instructions, a user guide, [examples](https://optimas.readthedocs.io/en/latest/examples/index.html) and the API reference.
## Support
Need more help? Join our [Slack channel](https://optimas-group.slack.com/) or open a [new issue](https://github.com/optimas-org/optimas/issues/new/choose).
## Citing optimas
If your usage of Optimas leads to a scientific publication, please consider citing the original [paper](https://link.aps.org/doi/10.1103/PhysRevAccelBeams.26.084601):
```bibtex
@article{PhysRevAccelBeams.26.084601,
  title = {Bayesian optimization of laser-plasma accelerators assisted by reduced physical models},
  author = {Ferran Pousa, A. and Jalas, S. and Kirchen, M. and Martinez de la Ossa, A. and Th\'evenet, M. and Hudson, S. and Larson, J. and Huebl, A. and Vay, J.-L. and Lehe, R.},
  journal = {Phys. Rev. Accel. Beams},
  volume = {26},
  issue = {8},
  pages = {084601},
  numpages = {9},
  year = {2023},
  month = {Aug},
  publisher = {American Physical Society},
  doi = {10.1103/PhysRevAccelBeams.26.084601},
  url = {https://link.aps.org/doi/10.1103/PhysRevAccelBeams.26.084601}
}
```
and libEnsemble:
```bibtex
@article{Hudson2022,
  title = {{libEnsemble}: A Library to Coordinate the Concurrent
           Evaluation of Dynamic Ensembles of Calculations},
  author = {Stephen Hudson and Jeffrey Larson and John-Luke Navarro and Stefan M. Wild},
  journal = {{IEEE} Transactions on Parallel and Distributed Systems},
  volume = {33},
  number = {4},
  pages = {977--988},
  year = {2022},
  doi = {10.1109/tpds.2021.3082815}
}
```