[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
# Nevergrad - A gradient-free optimization platform
![Nevergrad](https://raw.githubusercontent.com/facebookresearch/nevergrad/1.0.5/docs/resources/Nevergrad-LogoMark.png)
`nevergrad` is a Python 3.8+ library. It can be installed with:
```bash
pip install nevergrad
```
More installation options (including Windows installation) and complete instructions are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).
You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).
Minimizing a function using an optimizer (here `NGOpt`) is straightforward:
```python
import nevergrad as ng
def square(x):
    return sum((x - 0.5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value) # recommended value
>>> [0.49971112 0.5002944]
```
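If you prefer to drive the evaluation loop yourself (for instance to run evaluations in parallel or to log intermediate results), the same optimization can be expressed through the ask/tell interface. The sketch below is equivalent to the `minimize` call above:

```python
import nevergrad as ng

def square(x):
    return sum((x - 0.5) ** 2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()      # request a point to evaluate
    loss = square(candidate.value)   # evaluate it however you like
    optimizer.tell(candidate, loss)  # report the observed loss
print(optimizer.provide_recommendation().value)
```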
`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.
To use them, specify the input space:
```python
import nevergrad as ng
def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2) ** 2 + (batch_size - 4) ** 2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)
# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
```
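Since the recommendation stores the arguments of the function, one way to check the recommended configuration is to evaluate it directly (via `recommendation.args` and `recommendation.kwargs`):

```python
# re-evaluate the recommended configuration with the original function
loss = fake_training(*recommendation.args, **recommendation.kwargs)
print(loss)  # should be close to 0, the optimal value
```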
Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
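For instance, a bounded array parameter can be declared with `ng.p.Array` and `set_bounds`; a minimal sketch (the bounds and objective below are arbitrary example values):

```python
import nevergrad as ng

# a 2D array with each coordinate constrained to [-1, 1] (example bounds)
param = ng.p.Array(shape=(2,)).set_bounds(lower=-1.0, upper=1.0)
optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(lambda x: sum(x ** 2))
print(recommendation.value)
```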
![Example of optimization](https://raw.githubusercontent.com/facebookresearch/nevergrad/1.0.5/docs/resources/TwoPointsDE.gif)
*Convergence of a population of points to the minimum with two-points DE.*
## Documentation
Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress, so don't hesitate to submit issues and/or pull requests (PRs) to update it and make it clearer!
The latest versions of our [**data**](https://drive.google.com/file/d/1p8d1bMCDlvWrDIMXP7fT9pJa1cgjH3NM/view?usp=sharing) and of our [**PDF report**](https://tinyurl.com/dagstuhloid) are also available.
## Citing
```bibtex
@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}
```
## License
`nevergrad` is released under the MIT license. See [LICENSE](https://github.com/facebookresearch/nevergrad/blob/1.0.5/LICENSE) for details.
See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).